May 13 00:24:41.886063 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon May 12 22:46:21 -00 2025
May 13 00:24:41.886084 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=a30636f72ddb6c7dc7c9bee07b7cf23b403029ba1ff64eed2705530c62c7b592
May 13 00:24:41.886096 kernel: BIOS-provided physical RAM map:
May 13 00:24:41.886102 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
May 13 00:24:41.886108 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
May 13 00:24:41.886114 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
May 13 00:24:41.886121 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
May 13 00:24:41.886128 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
May 13 00:24:41.886134 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
May 13 00:24:41.886142 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
May 13 00:24:41.886148 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 13 00:24:41.886154 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
May 13 00:24:41.886160 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 13 00:24:41.886167 kernel: NX (Execute Disable) protection: active
May 13 00:24:41.886174 kernel: APIC: Static calls initialized
May 13 00:24:41.886183 kernel: SMBIOS 2.8 present.
May 13 00:24:41.886190 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
May 13 00:24:41.886197 kernel: Hypervisor detected: KVM
May 13 00:24:41.886203 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 13 00:24:41.886210 kernel: kvm-clock: using sched offset of 2235542870 cycles
May 13 00:24:41.886217 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 13 00:24:41.886224 kernel: tsc: Detected 2794.748 MHz processor
May 13 00:24:41.886231 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 13 00:24:41.886238 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 13 00:24:41.886245 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
May 13 00:24:41.886254 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
May 13 00:24:41.886261 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 13 00:24:41.886268 kernel: Using GB pages for direct mapping
May 13 00:24:41.886286 kernel: ACPI: Early table checksum verification disabled
May 13 00:24:41.886293 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
May 13 00:24:41.886300 kernel: ACPI: RSDT 0x000000009CFE2408 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 00:24:41.886307 kernel: ACPI: FACP 0x000000009CFE21E8 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 13 00:24:41.886313 kernel: ACPI: DSDT 0x000000009CFE0040 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 00:24:41.886323 kernel: ACPI: FACS 0x000000009CFE0000 000040
May 13 00:24:41.886330 kernel: ACPI: APIC 0x000000009CFE22DC 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 00:24:41.886337 kernel: ACPI: HPET 0x000000009CFE236C 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 00:24:41.886344 kernel: ACPI: MCFG 0x000000009CFE23A4 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 00:24:41.886350 kernel: ACPI: WAET 0x000000009CFE23E0 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 00:24:41.886357 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21e8-0x9cfe22db]
May 13 00:24:41.886364 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21e7]
May 13 00:24:41.886374 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
May 13 00:24:41.886384 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22dc-0x9cfe236b]
May 13 00:24:41.886391 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe236c-0x9cfe23a3]
May 13 00:24:41.886398 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23a4-0x9cfe23df]
May 13 00:24:41.886405 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23e0-0x9cfe2407]
May 13 00:24:41.886412 kernel: No NUMA configuration found
May 13 00:24:41.886419 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
May 13 00:24:41.886426 kernel: NODE_DATA(0) allocated [mem 0x9cfd6000-0x9cfdbfff]
May 13 00:24:41.886436 kernel: Zone ranges:
May 13 00:24:41.886443 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 13 00:24:41.886450 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
May 13 00:24:41.886457 kernel: Normal empty
May 13 00:24:41.886464 kernel: Movable zone start for each node
May 13 00:24:41.886471 kernel: Early memory node ranges
May 13 00:24:41.886478 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
May 13 00:24:41.886485 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
May 13 00:24:41.886492 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
May 13 00:24:41.886502 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 13 00:24:41.886509 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
May 13 00:24:41.886516 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
May 13 00:24:41.886523 kernel: ACPI: PM-Timer IO Port: 0x608
May 13 00:24:41.886530 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 13 00:24:41.886537 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 13 00:24:41.886544 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 13 00:24:41.886551 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 13 00:24:41.886558 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 13 00:24:41.886567 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 13 00:24:41.886575 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 13 00:24:41.886582 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 13 00:24:41.886589 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
May 13 00:24:41.886596 kernel: TSC deadline timer available
May 13 00:24:41.886603 kernel: smpboot: Allowing 4 CPUs, 0 hotplug CPUs
May 13 00:24:41.886610 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 13 00:24:41.886617 kernel: kvm-guest: KVM setup pv remote TLB flush
May 13 00:24:41.886624 kernel: kvm-guest: setup PV sched yield
May 13 00:24:41.886634 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
May 13 00:24:41.886641 kernel: Booting paravirtualized kernel on KVM
May 13 00:24:41.886648 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 13 00:24:41.886656 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
May 13 00:24:41.886663 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u524288
May 13 00:24:41.886670 kernel: pcpu-alloc: s197032 r8192 d32344 u524288 alloc=1*2097152
May 13 00:24:41.886677 kernel: pcpu-alloc: [0] 0 1 2 3
May 13 00:24:41.886684 kernel: kvm-guest: PV spinlocks enabled
May 13 00:24:41.886691 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 13 00:24:41.886699 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=a30636f72ddb6c7dc7c9bee07b7cf23b403029ba1ff64eed2705530c62c7b592
May 13 00:24:41.886709 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 13 00:24:41.886716 kernel: random: crng init done
May 13 00:24:41.886723 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 13 00:24:41.886731 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 13 00:24:41.886738 kernel: Fallback order for Node 0: 0
May 13 00:24:41.886745 kernel: Built 1 zonelists, mobility grouping on. Total pages: 632732
May 13 00:24:41.886752 kernel: Policy zone: DMA32
May 13 00:24:41.886759 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 13 00:24:41.886769 kernel: Memory: 2434592K/2571752K available (12288K kernel code, 2295K rwdata, 22740K rodata, 42864K init, 2328K bss, 136900K reserved, 0K cma-reserved)
May 13 00:24:41.886776 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
May 13 00:24:41.886783 kernel: ftrace: allocating 37944 entries in 149 pages
May 13 00:24:41.886790 kernel: ftrace: allocated 149 pages with 4 groups
May 13 00:24:41.886797 kernel: Dynamic Preempt: voluntary
May 13 00:24:41.886805 kernel: rcu: Preemptible hierarchical RCU implementation.
May 13 00:24:41.886812 kernel: rcu: RCU event tracing is enabled.
May 13 00:24:41.886820 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
May 13 00:24:41.886827 kernel: Trampoline variant of Tasks RCU enabled.
May 13 00:24:41.886837 kernel: Rude variant of Tasks RCU enabled.
May 13 00:24:41.886844 kernel: Tracing variant of Tasks RCU enabled.
May 13 00:24:41.886851 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 13 00:24:41.886858 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
May 13 00:24:41.886865 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
May 13 00:24:41.886872 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 13 00:24:41.886879 kernel: Console: colour VGA+ 80x25
May 13 00:24:41.886886 kernel: printk: console [ttyS0] enabled
May 13 00:24:41.886893 kernel: ACPI: Core revision 20230628
May 13 00:24:41.886903 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
May 13 00:24:41.886910 kernel: APIC: Switch to symmetric I/O mode setup
May 13 00:24:41.886917 kernel: x2apic enabled
May 13 00:24:41.886924 kernel: APIC: Switched APIC routing to: physical x2apic
May 13 00:24:41.886931 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
May 13 00:24:41.886938 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
May 13 00:24:41.886946 kernel: kvm-guest: setup PV IPIs
May 13 00:24:41.886962 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 13 00:24:41.886969 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
May 13 00:24:41.886977 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748)
May 13 00:24:41.886984 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 13 00:24:41.886992 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
May 13 00:24:41.887001 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
May 13 00:24:41.887009 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 13 00:24:41.887016 kernel: Spectre V2 : Mitigation: Retpolines
May 13 00:24:41.887024 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 13 00:24:41.887031 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
May 13 00:24:41.887041 kernel: RETBleed: Mitigation: untrained return thunk
May 13 00:24:41.887056 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 13 00:24:41.887063 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 13 00:24:41.887071 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
May 13 00:24:41.887079 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
May 13 00:24:41.887086 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
May 13 00:24:41.887094 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 13 00:24:41.887101 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 13 00:24:41.887111 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 13 00:24:41.887119 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 13 00:24:41.887126 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
May 13 00:24:41.887134 kernel: Freeing SMP alternatives memory: 32K
May 13 00:24:41.887141 kernel: pid_max: default: 32768 minimum: 301
May 13 00:24:41.887149 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
May 13 00:24:41.887156 kernel: landlock: Up and running.
May 13 00:24:41.887163 kernel: SELinux: Initializing.
May 13 00:24:41.887171 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 00:24:41.887181 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 00:24:41.887189 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
May 13 00:24:41.887196 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 00:24:41.887204 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 00:24:41.887212 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 00:24:41.887219 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
May 13 00:24:41.887226 kernel: ... version: 0
May 13 00:24:41.887234 kernel: ... bit width: 48
May 13 00:24:41.887241 kernel: ... generic registers: 6
May 13 00:24:41.887251 kernel: ... value mask: 0000ffffffffffff
May 13 00:24:41.887259 kernel: ... max period: 00007fffffffffff
May 13 00:24:41.887266 kernel: ... fixed-purpose events: 0
May 13 00:24:41.887295 kernel: ... event mask: 000000000000003f
May 13 00:24:41.887303 kernel: signal: max sigframe size: 1776
May 13 00:24:41.887310 kernel: rcu: Hierarchical SRCU implementation.
May 13 00:24:41.887318 kernel: rcu: Max phase no-delay instances is 400.
May 13 00:24:41.887325 kernel: smp: Bringing up secondary CPUs ...
May 13 00:24:41.887333 kernel: smpboot: x86: Booting SMP configuration:
May 13 00:24:41.887343 kernel: .... node #0, CPUs: #1 #2 #3
May 13 00:24:41.887350 kernel: smp: Brought up 1 node, 4 CPUs
May 13 00:24:41.887357 kernel: smpboot: Max logical packages: 1
May 13 00:24:41.887365 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS)
May 13 00:24:41.887372 kernel: devtmpfs: initialized
May 13 00:24:41.887380 kernel: x86/mm: Memory block size: 128MB
May 13 00:24:41.887387 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 13 00:24:41.887395 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
May 13 00:24:41.887402 kernel: pinctrl core: initialized pinctrl subsystem
May 13 00:24:41.887412 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 13 00:24:41.887419 kernel: audit: initializing netlink subsys (disabled)
May 13 00:24:41.887427 kernel: audit: type=2000 audit(1747095881.427:1): state=initialized audit_enabled=0 res=1
May 13 00:24:41.887434 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 13 00:24:41.887442 kernel: thermal_sys: Registered thermal governor 'user_space'
May 13 00:24:41.887449 kernel: cpuidle: using governor menu
May 13 00:24:41.887457 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 13 00:24:41.887464 kernel: dca service started, version 1.12.1
May 13 00:24:41.887472 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
May 13 00:24:41.887481 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
May 13 00:24:41.887489 kernel: PCI: Using configuration type 1 for base access
May 13 00:24:41.887496 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 13 00:24:41.887504 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 13 00:24:41.887511 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 13 00:24:41.887519 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 13 00:24:41.887526 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 13 00:24:41.887534 kernel: ACPI: Added _OSI(Module Device)
May 13 00:24:41.887541 kernel: ACPI: Added _OSI(Processor Device)
May 13 00:24:41.887551 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 13 00:24:41.887558 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 13 00:24:41.887566 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 13 00:24:41.887573 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
May 13 00:24:41.887581 kernel: ACPI: Interpreter enabled
May 13 00:24:41.887588 kernel: ACPI: PM: (supports S0 S3 S5)
May 13 00:24:41.887596 kernel: ACPI: Using IOAPIC for interrupt routing
May 13 00:24:41.887603 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 13 00:24:41.887611 kernel: PCI: Using E820 reservations for host bridge windows
May 13 00:24:41.887620 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
May 13 00:24:41.887628 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 13 00:24:41.887859 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 13 00:24:41.887990 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
May 13 00:24:41.888120 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
May 13 00:24:41.888145 kernel: PCI host bridge to bus 0000:00
May 13 00:24:41.888320 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 13 00:24:41.888440 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 13 00:24:41.888551 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 13 00:24:41.888661 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
May 13 00:24:41.888771 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
May 13 00:24:41.888883 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
May 13 00:24:41.888994 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 13 00:24:41.889140 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
May 13 00:24:41.889292 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000
May 13 00:24:41.889453 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfd000000-0xfdffffff pref]
May 13 00:24:41.889617 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfebd0000-0xfebd0fff]
May 13 00:24:41.889779 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfebc0000-0xfebcffff pref]
May 13 00:24:41.889920 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 13 00:24:41.890083 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00
May 13 00:24:41.890215 kernel: pci 0000:00:02.0: reg 0x10: [io 0xc0c0-0xc0df]
May 13 00:24:41.890354 kernel: pci 0000:00:02.0: reg 0x14: [mem 0xfebd1000-0xfebd1fff]
May 13 00:24:41.890479 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfe000000-0xfe003fff 64bit pref]
May 13 00:24:41.890608 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000
May 13 00:24:41.890730 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc000-0xc07f]
May 13 00:24:41.890851 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfebd2000-0xfebd2fff]
May 13 00:24:41.891008 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe004000-0xfe007fff 64bit pref]
May 13 00:24:41.891173 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000
May 13 00:24:41.891322 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc0e0-0xc0ff]
May 13 00:24:41.891446 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfebd3000-0xfebd3fff]
May 13 00:24:41.891570 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe008000-0xfe00bfff 64bit pref]
May 13 00:24:41.891693 kernel: pci 0000:00:04.0: reg 0x30: [mem 0xfeb80000-0xfebbffff pref]
May 13 00:24:41.891835 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
May 13 00:24:41.891980 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
May 13 00:24:41.892123 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
May 13 00:24:41.892243 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc100-0xc11f]
May 13 00:24:41.892415 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfebd4000-0xfebd4fff]
May 13 00:24:41.892545 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
May 13 00:24:41.892669 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
May 13 00:24:41.892680 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 13 00:24:41.892688 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 13 00:24:41.892700 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 13 00:24:41.892708 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 13 00:24:41.892716 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
May 13 00:24:41.892723 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
May 13 00:24:41.892731 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
May 13 00:24:41.892738 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
May 13 00:24:41.892746 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
May 13 00:24:41.892754 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
May 13 00:24:41.892761 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
May 13 00:24:41.892771 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
May 13 00:24:41.892779 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
May 13 00:24:41.892787 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
May 13 00:24:41.892794 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
May 13 00:24:41.892802 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
May 13 00:24:41.892810 kernel: iommu: Default domain type: Translated
May 13 00:24:41.892818 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 13 00:24:41.892825 kernel: PCI: Using ACPI for IRQ routing
May 13 00:24:41.892833 kernel: PCI: pci_cache_line_size set to 64 bytes
May 13 00:24:41.892842 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
May 13 00:24:41.892850 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
May 13 00:24:41.892969 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
May 13 00:24:41.893096 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
May 13 00:24:41.893215 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 13 00:24:41.893225 kernel: vgaarb: loaded
May 13 00:24:41.893233 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
May 13 00:24:41.893241 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
May 13 00:24:41.893252 kernel: clocksource: Switched to clocksource kvm-clock
May 13 00:24:41.893260 kernel: VFS: Disk quotas dquot_6.6.0
May 13 00:24:41.893268 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 13 00:24:41.893288 kernel: pnp: PnP ACPI init
May 13 00:24:41.893432 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
May 13 00:24:41.893443 kernel: pnp: PnP ACPI: found 6 devices
May 13 00:24:41.893451 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 13 00:24:41.893459 kernel: NET: Registered PF_INET protocol family
May 13 00:24:41.893471 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 13 00:24:41.893479 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 13 00:24:41.893487 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 13 00:24:41.893494 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 13 00:24:41.893502 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 13 00:24:41.893510 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 13 00:24:41.893517 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 00:24:41.893525 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 00:24:41.893533 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 13 00:24:41.893543 kernel: NET: Registered PF_XDP protocol family
May 13 00:24:41.893654 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 13 00:24:41.893764 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 13 00:24:41.893872 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 13 00:24:41.893982 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
May 13 00:24:41.894101 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
May 13 00:24:41.894210 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
May 13 00:24:41.894221 kernel: PCI: CLS 0 bytes, default 64
May 13 00:24:41.894232 kernel: Initialise system trusted keyrings
May 13 00:24:41.894240 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 13 00:24:41.894248 kernel: Key type asymmetric registered
May 13 00:24:41.894255 kernel: Asymmetric key parser 'x509' registered
May 13 00:24:41.894263 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
May 13 00:24:41.894270 kernel: io scheduler mq-deadline registered
May 13 00:24:41.894339 kernel: io scheduler kyber registered
May 13 00:24:41.894346 kernel: io scheduler bfq registered
May 13 00:24:41.894354 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 13 00:24:41.894362 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
May 13 00:24:41.894374 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
May 13 00:24:41.894382 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
May 13 00:24:41.894389 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 13 00:24:41.894397 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 13 00:24:41.894404 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 13 00:24:41.894412 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 13 00:24:41.894419 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 13 00:24:41.894548 kernel: rtc_cmos 00:04: RTC can wake from S4
May 13 00:24:41.894562 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 13 00:24:41.894675 kernel: rtc_cmos 00:04: registered as rtc0
May 13 00:24:41.894788 kernel: rtc_cmos 00:04: setting system clock to 2025-05-13T00:24:41 UTC (1747095881)
May 13 00:24:41.894901 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
May 13 00:24:41.894911 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
May 13 00:24:41.894919 kernel: NET: Registered PF_INET6 protocol family
May 13 00:24:41.894926 kernel: Segment Routing with IPv6
May 13 00:24:41.894934 kernel: In-situ OAM (IOAM) with IPv6
May 13 00:24:41.894945 kernel: NET: Registered PF_PACKET protocol family
May 13 00:24:41.894952 kernel: Key type dns_resolver registered
May 13 00:24:41.894959 kernel: IPI shorthand broadcast: enabled
May 13 00:24:41.894967 kernel: sched_clock: Marking stable (598002984, 104670205)->(721215323, -18542134)
May 13 00:24:41.894975 kernel: registered taskstats version 1
May 13 00:24:41.894982 kernel: Loading compiled-in X.509 certificates
May 13 00:24:41.894990 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: b404fdaaed18d29adfca671c3bbb23eee96fb08f'
May 13 00:24:41.894997 kernel: Key type .fscrypt registered
May 13 00:24:41.895004 kernel: Key type fscrypt-provisioning registered
May 13 00:24:41.895014 kernel: ima: No TPM chip found, activating TPM-bypass!
May 13 00:24:41.895021 kernel: ima: Allocated hash algorithm: sha1
May 13 00:24:41.895029 kernel: ima: No architecture policies found
May 13 00:24:41.895036 kernel: clk: Disabling unused clocks
May 13 00:24:41.895044 kernel: Freeing unused kernel image (initmem) memory: 42864K
May 13 00:24:41.895058 kernel: Write protecting the kernel read-only data: 36864k
May 13 00:24:41.895066 kernel: Freeing unused kernel image (rodata/data gap) memory: 1836K
May 13 00:24:41.895073 kernel: Run /init as init process
May 13 00:24:41.895081 kernel: with arguments:
May 13 00:24:41.895091 kernel: /init
May 13 00:24:41.895098 kernel: with environment:
May 13 00:24:41.895105 kernel: HOME=/
May 13 00:24:41.895112 kernel: TERM=linux
May 13 00:24:41.895119 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 13 00:24:41.895129 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
May 13 00:24:41.895138 systemd[1]: Detected virtualization kvm.
May 13 00:24:41.895147 systemd[1]: Detected architecture x86-64.
May 13 00:24:41.895157 systemd[1]: Running in initrd.
May 13 00:24:41.895164 systemd[1]: No hostname configured, using default hostname.
May 13 00:24:41.895172 systemd[1]: Hostname set to .
May 13 00:24:41.895180 systemd[1]: Initializing machine ID from VM UUID.
May 13 00:24:41.895188 systemd[1]: Queued start job for default target initrd.target.
May 13 00:24:41.895196 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 00:24:41.895204 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 00:24:41.895213 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 13 00:24:41.895224 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 00:24:41.895243 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 13 00:24:41.895253 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 13 00:24:41.895263 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 13 00:24:41.895286 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 13 00:24:41.895294 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 00:24:41.895303 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 00:24:41.895311 systemd[1]: Reached target paths.target - Path Units.
May 13 00:24:41.895319 systemd[1]: Reached target slices.target - Slice Units.
May 13 00:24:41.895328 systemd[1]: Reached target swap.target - Swaps.
May 13 00:24:41.895336 systemd[1]: Reached target timers.target - Timer Units.
May 13 00:24:41.895344 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 13 00:24:41.895352 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 00:24:41.895363 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 13 00:24:41.895371 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
May 13 00:24:41.895379 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 00:24:41.895388 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 00:24:41.895396 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 00:24:41.895404 systemd[1]: Reached target sockets.target - Socket Units.
May 13 00:24:41.895412 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 13 00:24:41.895421 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 00:24:41.895429 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 13 00:24:41.895440 systemd[1]: Starting systemd-fsck-usr.service...
May 13 00:24:41.895448 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 00:24:41.895456 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 00:24:41.895464 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 00:24:41.895474 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 13 00:24:41.895483 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 00:24:41.895491 systemd[1]: Finished systemd-fsck-usr.service.
May 13 00:24:41.895519 systemd-journald[192]: Collecting audit messages is disabled.
May 13 00:24:41.895540 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 13 00:24:41.895549 systemd-journald[192]: Journal started
May 13 00:24:41.895570 systemd-journald[192]: Runtime Journal (/run/log/journal/496fa4f863224b2194d29077cf68b0d1) is 6.0M, max 48.4M, 42.3M free.
May 13 00:24:41.889862 systemd-modules-load[193]: Inserted module 'overlay'
May 13 00:24:41.932221 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 00:24:41.932237 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 13 00:24:41.932248 kernel: Bridge firewalling registered
May 13 00:24:41.916255 systemd-modules-load[193]: Inserted module 'br_netfilter'
May 13 00:24:41.933360 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 00:24:41.935732 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 00:24:41.938185 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 13 00:24:41.949464 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 00:24:41.952519 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 00:24:41.955119 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 00:24:41.960067 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 00:24:41.965519 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 00:24:41.968648 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 00:24:41.970335 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 00:24:41.971569 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 13 00:24:41.980891 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 00:24:41.985228 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 13 00:24:41.987771 dracut-cmdline[225]: dracut-dracut-053
May 13 00:24:41.989945 dracut-cmdline[225]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=a30636f72ddb6c7dc7c9bee07b7cf23b403029ba1ff64eed2705530c62c7b592
May 13 00:24:42.025585 systemd-resolved[233]: Positive Trust Anchors:
May 13 00:24:42.025601 systemd-resolved[233]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 13 00:24:42.025632 systemd-resolved[233]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 13 00:24:42.028145 systemd-resolved[233]: Defaulting to hostname 'linux'.
May 13 00:24:42.029183 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 13 00:24:42.034952 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 13 00:24:42.099307 kernel: SCSI subsystem initialized
May 13 00:24:42.108312 kernel: Loading iSCSI transport class v2.0-870.
May 13 00:24:42.119334 kernel: iscsi: registered transport (tcp)
May 13 00:24:42.141427 kernel: iscsi: registered transport (qla4xxx)
May 13 00:24:42.141504 kernel: QLogic iSCSI HBA Driver
May 13 00:24:42.190945 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 13 00:24:42.209414 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 13 00:24:42.234308 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 13 00:24:42.234344 kernel: device-mapper: uevent: version 1.0.3
May 13 00:24:42.235872 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
May 13 00:24:42.277304 kernel: raid6: avx2x4 gen() 30143 MB/s
May 13 00:24:42.294301 kernel: raid6: avx2x2 gen() 31510 MB/s
May 13 00:24:42.311381 kernel: raid6: avx2x1 gen() 25891 MB/s
May 13 00:24:42.311404 kernel: raid6: using algorithm avx2x2 gen() 31510 MB/s
May 13 00:24:42.329450 kernel: raid6: .... xor() 19705 MB/s, rmw enabled
May 13 00:24:42.329471 kernel: raid6: using avx2x2 recovery algorithm
May 13 00:24:42.349298 kernel: xor: automatically using best checksumming function avx
May 13 00:24:42.504315 kernel: Btrfs loaded, zoned=no, fsverity=no
May 13 00:24:42.516321 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 13 00:24:42.531425 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 00:24:42.543345 systemd-udevd[412]: Using default interface naming scheme 'v255'.
May 13 00:24:42.547837 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 00:24:42.561467 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 13 00:24:42.574821 dracut-pre-trigger[421]: rd.md=0: removing MD RAID activation
May 13 00:24:42.606431 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 00:24:42.619467 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 00:24:42.680204 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 00:24:42.688443 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 13 00:24:42.701968 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 13 00:24:42.703078 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 00:24:42.706886 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 00:24:42.708137 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 00:24:42.718964 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
May 13 00:24:42.718455 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 13 00:24:42.727846 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
May 13 00:24:42.728039 kernel: cryptd: max_cpu_qlen set to 1000
May 13 00:24:42.730601 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 13 00:24:42.735570 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 13 00:24:42.749823 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
May 13 00:24:42.749859 kernel: GPT:9289727 != 19775487
May 13 00:24:42.749877 kernel: GPT:Alternate GPT header not at the end of the disk.
May 13 00:24:42.749907 kernel: GPT:9289727 != 19775487
May 13 00:24:42.749925 kernel: GPT: Use GNU Parted to correct GPT errors.
May 13 00:24:42.749943 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 00:24:42.735698 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 00:24:42.753778 kernel: AVX2 version of gcm_enc/dec engaged.
May 13 00:24:42.753794 kernel: libata version 3.00 loaded.
May 13 00:24:42.753809 kernel: AES CTR mode by8 optimization enabled
May 13 00:24:42.737101 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 00:24:42.739926 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 00:24:42.740051 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 00:24:42.745336 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 13 00:24:42.756608 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 00:24:42.763329 kernel: ahci 0000:00:1f.2: version 3.0
May 13 00:24:42.763565 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
May 13 00:24:42.766388 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
May 13 00:24:42.766580 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
May 13 00:24:42.771293 kernel: scsi host0: ahci
May 13 00:24:42.773434 kernel: scsi host1: ahci
May 13 00:24:42.774305 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 scanned by (udev-worker) (458)
May 13 00:24:42.776432 kernel: scsi host2: ahci
May 13 00:24:42.777574 kernel: BTRFS: device fsid b9c18834-b687-45d3-9868-9ac29dc7ddd7 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (473)
May 13 00:24:42.785306 kernel: scsi host3: ahci
May 13 00:24:42.785883 kernel: scsi host4: ahci
May 13 00:24:42.787883 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
May 13 00:24:42.818607 kernel: scsi host5: ahci
May 13 00:24:42.818776 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34
May 13 00:24:42.818788 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34
May 13 00:24:42.818798 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34
May 13 00:24:42.818814 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34
May 13 00:24:42.818823 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34
May 13 00:24:42.818836 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34
May 13 00:24:42.822544 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
May 13 00:24:42.825258 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 00:24:42.837856 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 13 00:24:42.842060 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
May 13 00:24:42.842502 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
May 13 00:24:42.857404 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 13 00:24:42.858481 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 00:24:42.870966 disk-uuid[566]: Primary Header is updated.
May 13 00:24:42.870966 disk-uuid[566]: Secondary Entries is updated.
May 13 00:24:42.870966 disk-uuid[566]: Secondary Header is updated.
May 13 00:24:42.877315 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 00:24:42.878964 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 00:24:42.883298 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 00:24:43.107300 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
May 13 00:24:43.107376 kernel: ata5: SATA link down (SStatus 0 SControl 300)
May 13 00:24:43.107387 kernel: ata6: SATA link down (SStatus 0 SControl 300)
May 13 00:24:43.108322 kernel: ata2: SATA link down (SStatus 0 SControl 300)
May 13 00:24:43.108398 kernel: ata1: SATA link down (SStatus 0 SControl 300)
May 13 00:24:43.109318 kernel: ata4: SATA link down (SStatus 0 SControl 300)
May 13 00:24:43.110313 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
May 13 00:24:43.111591 kernel: ata3.00: applying bridge limits
May 13 00:24:43.111624 kernel: ata3.00: configured for UDMA/100
May 13 00:24:43.112309 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
May 13 00:24:43.160315 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
May 13 00:24:43.160534 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
May 13 00:24:43.174307 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
May 13 00:24:43.883224 disk-uuid[571]: The operation has completed successfully.
May 13 00:24:43.884537 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 00:24:43.912061 systemd[1]: disk-uuid.service: Deactivated successfully.
May 13 00:24:43.912186 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 13 00:24:43.932422 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 13 00:24:43.938335 sh[591]: Success
May 13 00:24:43.951326 kernel: device-mapper: verity: sha256 using implementation "sha256-ni"
May 13 00:24:43.984862 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 13 00:24:43.994741 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 13 00:24:43.998841 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 13 00:24:44.009968 kernel: BTRFS info (device dm-0): first mount of filesystem b9c18834-b687-45d3-9868-9ac29dc7ddd7
May 13 00:24:44.010015 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
May 13 00:24:44.010027 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
May 13 00:24:44.011017 kernel: BTRFS info (device dm-0): disabling log replay at mount time
May 13 00:24:44.011764 kernel: BTRFS info (device dm-0): using free space tree
May 13 00:24:44.016545 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 13 00:24:44.018957 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
May 13 00:24:44.035492 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 13 00:24:44.037218 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 13 00:24:44.045383 kernel: BTRFS info (device vda6): first mount of filesystem 97fe19c2-c075-4d7e-9417-f9c367b49e5c
May 13 00:24:44.045411 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 13 00:24:44.045422 kernel: BTRFS info (device vda6): using free space tree
May 13 00:24:44.049309 kernel: BTRFS info (device vda6): auto enabling async discard
May 13 00:24:44.058180 systemd[1]: mnt-oem.mount: Deactivated successfully.
May 13 00:24:44.059845 kernel: BTRFS info (device vda6): last unmount of filesystem 97fe19c2-c075-4d7e-9417-f9c367b49e5c
May 13 00:24:44.069738 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 13 00:24:44.076504 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 13 00:24:44.129601 ignition[685]: Ignition 2.19.0
May 13 00:24:44.130184 ignition[685]: Stage: fetch-offline
May 13 00:24:44.130234 ignition[685]: no configs at "/usr/lib/ignition/base.d"
May 13 00:24:44.130244 ignition[685]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 00:24:44.130363 ignition[685]: parsed url from cmdline: ""
May 13 00:24:44.130367 ignition[685]: no config URL provided
May 13 00:24:44.130373 ignition[685]: reading system config file "/usr/lib/ignition/user.ign"
May 13 00:24:44.130383 ignition[685]: no config at "/usr/lib/ignition/user.ign"
May 13 00:24:44.130415 ignition[685]: op(1): [started] loading QEMU firmware config module
May 13 00:24:44.130420 ignition[685]: op(1): executing: "modprobe" "qemu_fw_cfg"
May 13 00:24:44.139698 ignition[685]: op(1): [finished] loading QEMU firmware config module
May 13 00:24:44.158547 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 13 00:24:44.170450 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 13 00:24:44.182810 ignition[685]: parsing config with SHA512: cd432e99eb27c28f8cea4771642b41ae4d378fee90c94c3a69cb5c6a0cda9b0ff3f029968aced8d8633d0da5a78379c220ac5c90c954e30a3a696948fc2a2a5a
May 13 00:24:44.186326 unknown[685]: fetched base config from "system"
May 13 00:24:44.186340 unknown[685]: fetched user config from "qemu"
May 13 00:24:44.186649 ignition[685]: fetch-offline: fetch-offline passed
May 13 00:24:44.189018 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 13 00:24:44.186704 ignition[685]: Ignition finished successfully
May 13 00:24:44.198156 systemd-networkd[779]: lo: Link UP
May 13 00:24:44.198167 systemd-networkd[779]: lo: Gained carrier
May 13 00:24:44.199809 systemd-networkd[779]: Enumeration completed
May 13 00:24:44.199889 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 13 00:24:44.200212 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 00:24:44.200216 systemd-networkd[779]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 00:24:44.201616 systemd-networkd[779]: eth0: Link UP
May 13 00:24:44.201620 systemd-networkd[779]: eth0: Gained carrier
May 13 00:24:44.201628 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 00:24:44.202202 systemd[1]: Reached target network.target - Network.
May 13 00:24:44.204012 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
May 13 00:24:44.213397 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 13 00:24:44.218336 systemd-networkd[779]: eth0: DHCPv4 address 10.0.0.89/16, gateway 10.0.0.1 acquired from 10.0.0.1
May 13 00:24:44.225165 ignition[782]: Ignition 2.19.0
May 13 00:24:44.225175 ignition[782]: Stage: kargs
May 13 00:24:44.225410 ignition[782]: no configs at "/usr/lib/ignition/base.d"
May 13 00:24:44.225421 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 00:24:44.226168 ignition[782]: kargs: kargs passed
May 13 00:24:44.226210 ignition[782]: Ignition finished successfully
May 13 00:24:44.230185 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 13 00:24:44.240422 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 13 00:24:44.252191 ignition[791]: Ignition 2.19.0
May 13 00:24:44.252202 ignition[791]: Stage: disks
May 13 00:24:44.252379 ignition[791]: no configs at "/usr/lib/ignition/base.d"
May 13 00:24:44.252391 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 00:24:44.253150 ignition[791]: disks: disks passed
May 13 00:24:44.253193 ignition[791]: Ignition finished successfully
May 13 00:24:44.258915 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 13 00:24:44.261101 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
May 13 00:24:44.261581 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 13 00:24:44.261905 systemd[1]: Reached target local-fs.target - Local File Systems.
May 13 00:24:44.262266 systemd[1]: Reached target sysinit.target - System Initialization.
May 13 00:24:44.262808 systemd[1]: Reached target basic.target - Basic System.
May 13 00:24:44.280441 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 13 00:24:44.296853 systemd-fsck[802]: ROOT: clean, 14/553520 files, 52654/553472 blocks
May 13 00:24:44.306396 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 13 00:24:44.316422 systemd[1]: Mounting sysroot.mount - /sysroot...
May 13 00:24:44.400316 kernel: EXT4-fs (vda9): mounted filesystem 422ad498-4f61-405b-9d71-25f19459d196 r/w with ordered data mode. Quota mode: none.
May 13 00:24:44.400358 systemd[1]: Mounted sysroot.mount - /sysroot.
May 13 00:24:44.401384 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 13 00:24:44.409370 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 13 00:24:44.411185 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 13 00:24:44.411954 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
May 13 00:24:44.411996 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 13 00:24:44.419557 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/vda6 scanned by mount (810)
May 13 00:24:44.412017 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 13 00:24:44.423460 kernel: BTRFS info (device vda6): first mount of filesystem 97fe19c2-c075-4d7e-9417-f9c367b49e5c
May 13 00:24:44.423475 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 13 00:24:44.423486 kernel: BTRFS info (device vda6): using free space tree
May 13 00:24:44.425290 kernel: BTRFS info (device vda6): auto enabling async discard
May 13 00:24:44.426874 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 13 00:24:44.433556 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 13 00:24:44.434852 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 13 00:24:44.470256 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory
May 13 00:24:44.475478 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory
May 13 00:24:44.479932 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory
May 13 00:24:44.483510 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory
May 13 00:24:44.560334 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 13 00:24:44.572411 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 13 00:24:44.574213 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 13 00:24:44.582306 kernel: BTRFS info (device vda6): last unmount of filesystem 97fe19c2-c075-4d7e-9417-f9c367b49e5c
May 13 00:24:44.597165 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 13 00:24:44.603533 ignition[925]: INFO : Ignition 2.19.0
May 13 00:24:44.603533 ignition[925]: INFO : Stage: mount
May 13 00:24:44.605213 ignition[925]: INFO : no configs at "/usr/lib/ignition/base.d"
May 13 00:24:44.605213 ignition[925]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 00:24:44.605213 ignition[925]: INFO : mount: mount passed
May 13 00:24:44.605213 ignition[925]: INFO : Ignition finished successfully
May 13 00:24:44.610759 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 13 00:24:44.615483 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 13 00:24:45.009393 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 13 00:24:45.022408 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 13 00:24:45.030325 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by mount (937)
May 13 00:24:45.030379 kernel: BTRFS info (device vda6): first mount of filesystem 97fe19c2-c075-4d7e-9417-f9c367b49e5c
May 13 00:24:45.030390 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 13 00:24:45.032305 kernel: BTRFS info (device vda6): using free space tree
May 13 00:24:45.034308 kernel: BTRFS info (device vda6): auto enabling async discard
May 13 00:24:45.035938 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 13 00:24:45.055724 ignition[954]: INFO : Ignition 2.19.0
May 13 00:24:45.055724 ignition[954]: INFO : Stage: files
May 13 00:24:45.057373 ignition[954]: INFO : no configs at "/usr/lib/ignition/base.d"
May 13 00:24:45.057373 ignition[954]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 00:24:45.057373 ignition[954]: DEBUG : files: compiled without relabeling support, skipping
May 13 00:24:45.060853 ignition[954]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 13 00:24:45.060853 ignition[954]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 13 00:24:45.060853 ignition[954]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 13 00:24:45.064700 ignition[954]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 13 00:24:45.066104 ignition[954]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 13 00:24:45.065132 unknown[954]: wrote ssh authorized keys file for user: core
May 13 00:24:45.068460 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
May 13 00:24:45.068460 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
May 13 00:24:45.115808 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 13 00:24:45.298970 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
May 13 00:24:45.298970 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
May 13 00:24:45.302885 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1
May 13 00:24:45.758568 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 13 00:24:46.099445 systemd-networkd[779]: eth0: Gained IPv6LL
May 13 00:24:46.147431 ignition[954]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
May 13 00:24:46.147431 ignition[954]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
May 13 00:24:46.151217 ignition[954]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 13 00:24:46.153378 ignition[954]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 13 00:24:46.153378 ignition[954]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
May 13 00:24:46.153378 ignition[954]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
May 13 00:24:46.157769 ignition[954]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 13 00:24:46.159653 ignition[954]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 13 00:24:46.159653 ignition[954]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
May 13 00:24:46.162848 ignition[954]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
May 13 00:24:46.184706 ignition[954]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
May 13 00:24:46.190211 ignition[954]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
May 13 00:24:46.191827 ignition[954]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
May 13 00:24:46.191827 ignition[954]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
May 13 00:24:46.191827 ignition[954]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
May 13 00:24:46.191827 ignition[954]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
May 13 00:24:46.191827 ignition[954]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
May 13 00:24:46.191827 ignition[954]: INFO : files: files passed
May 13 00:24:46.191827 ignition[954]: INFO : Ignition finished successfully
May 13 00:24:46.193545 systemd[1]: Finished ignition-files.service - Ignition (files).
May 13 00:24:46.202424 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 13 00:24:46.204165 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 13 00:24:46.206736 systemd[1]: ignition-quench.service: Deactivated successfully.
May 13 00:24:46.206869 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 13 00:24:46.214981 initrd-setup-root-after-ignition[983]: grep: /sysroot/oem/oem-release: No such file or directory
May 13 00:24:46.217951 initrd-setup-root-after-ignition[985]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 13 00:24:46.217951 initrd-setup-root-after-ignition[985]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 13 00:24:46.222368 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 13 00:24:46.220081 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 13 00:24:46.222908 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 13 00:24:46.234427 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 13 00:24:46.258607 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 13 00:24:46.258735 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 13 00:24:46.261056 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 13 00:24:46.263358 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 13 00:24:46.264455 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 13 00:24:46.269404 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 13 00:24:46.284162 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 13 00:24:46.296453 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 13 00:24:46.308417 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 13 00:24:46.308960 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 00:24:46.309578 systemd[1]: Stopped target timers.target - Timer Units.
May 13 00:24:46.313713 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 13 00:24:46.313822 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 13 00:24:46.316849 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 13 00:24:46.317320 systemd[1]: Stopped target basic.target - Basic System.
May 13 00:24:46.317806 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 13 00:24:46.318149 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 13 00:24:46.318653 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 13 00:24:46.319000 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 13 00:24:46.319344 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 00:24:46.330534 systemd[1]: Stopped target sysinit.target - System Initialization.
May 13 00:24:46.331088 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 13 00:24:46.331585 systemd[1]: Stopped target swap.target - Swaps.
May 13 00:24:46.331893 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 13 00:24:46.332008 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 13 00:24:46.338677 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 13 00:24:46.340868 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 00:24:46.341656 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 13 00:24:46.341752 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 00:24:46.342071 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 13 00:24:46.342176 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 13 00:24:46.342980 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 13 00:24:46.343084 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 13 00:24:46.343591 systemd[1]: Stopped target paths.target - Path Units.
May 13 00:24:46.343890 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 13 00:24:46.357348 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 00:24:46.358191 systemd[1]: Stopped target slices.target - Slice Units.
May 13 00:24:46.358858 systemd[1]: Stopped target sockets.target - Socket Units.
May 13 00:24:46.359203 systemd[1]: iscsid.socket: Deactivated successfully.
May 13 00:24:46.359304 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 13 00:24:46.359737 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 13 00:24:46.359822 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 00:24:46.365625 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 13 00:24:46.365734 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 13 00:24:46.367306 systemd[1]: ignition-files.service: Deactivated successfully.
May 13 00:24:46.367408 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 13 00:24:46.382413 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 13 00:24:46.382675 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 13 00:24:46.382785 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 00:24:46.385461 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 13 00:24:46.386754 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 13 00:24:46.386870 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 00:24:46.387179 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 13 00:24:46.387291 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 00:24:46.393736 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 13 00:24:46.393840 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 13 00:24:46.407663 ignition[1010]: INFO : Ignition 2.19.0
May 13 00:24:46.407663 ignition[1010]: INFO : Stage: umount
May 13 00:24:46.409476 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d"
May 13 00:24:46.409476 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 00:24:46.412301 ignition[1010]: INFO : umount: umount passed
May 13 00:24:46.413268 ignition[1010]: INFO : Ignition finished successfully
May 13 00:24:46.413061 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 13 00:24:46.416062 systemd[1]: ignition-mount.service: Deactivated successfully.
May 13 00:24:46.416210 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 13 00:24:46.418262 systemd[1]: Stopped target network.target - Network.
May 13 00:24:46.419891 systemd[1]: ignition-disks.service: Deactivated successfully.
May 13 00:24:46.419954 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 13 00:24:46.421852 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 13 00:24:46.421899 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 13 00:24:46.423742 systemd[1]: ignition-setup.service: Deactivated successfully.
May 13 00:24:46.423788 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 13 00:24:46.426055 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 13 00:24:46.426105 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 13 00:24:46.428182 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 13 00:24:46.430264 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 13 00:24:46.433345 systemd-networkd[779]: eth0: DHCPv6 lease lost
May 13 00:24:46.435164 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 13 00:24:46.435330 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 13 00:24:46.437789 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 13 00:24:46.437968 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 13 00:24:46.440912 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 13 00:24:46.441002 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 13 00:24:46.454448 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 13 00:24:46.455450 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 13 00:24:46.455522 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 13 00:24:46.457697 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 13 00:24:46.457744 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 13 00:24:46.459777 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 13 00:24:46.459823 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 13 00:24:46.462115 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 13 00:24:46.462166 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 00:24:46.464333 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 00:24:46.474515 systemd[1]: network-cleanup.service: Deactivated successfully.
May 13 00:24:46.497152 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 13 00:24:46.510067 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 13 00:24:46.511147 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 00:24:46.513798 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 13 00:24:46.513856 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 13 00:24:46.517083 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 13 00:24:46.517135 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 00:24:46.517760 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 13 00:24:46.517810 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 13 00:24:46.518584 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 13 00:24:46.518630 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 13 00:24:46.523557 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 13 00:24:46.523605 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 00:24:46.539399 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 13 00:24:46.539645 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 13 00:24:46.539698 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 00:24:46.541797 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 00:24:46.541845 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 00:24:46.547034 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 13 00:24:46.547157 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 13 00:24:46.574581 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 13 00:24:46.574708 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 13 00:24:46.575466 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 13 00:24:46.575798 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 13 00:24:46.575843 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 13 00:24:46.582488 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 13 00:24:46.589140 systemd[1]: Switching root.
May 13 00:24:46.623714 systemd-journald[192]: Journal stopped
May 13 00:24:47.673895 systemd-journald[192]: Received SIGTERM from PID 1 (systemd).
May 13 00:24:47.673967 kernel: SELinux: policy capability network_peer_controls=1
May 13 00:24:47.673986 kernel: SELinux: policy capability open_perms=1
May 13 00:24:47.673997 kernel: SELinux: policy capability extended_socket_class=1
May 13 00:24:47.674008 kernel: SELinux: policy capability always_check_network=0
May 13 00:24:47.674025 kernel: SELinux: policy capability cgroup_seclabel=1
May 13 00:24:47.674036 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 13 00:24:47.674059 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 13 00:24:47.674070 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 13 00:24:47.674081 kernel: audit: type=1403 audit(1747095886.969:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 13 00:24:47.674100 systemd[1]: Successfully loaded SELinux policy in 39.633ms.
May 13 00:24:47.674114 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 12.397ms.
May 13 00:24:47.674129 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
May 13 00:24:47.674141 systemd[1]: Detected virtualization kvm.
May 13 00:24:47.674158 systemd[1]: Detected architecture x86-64.
May 13 00:24:47.674170 systemd[1]: Detected first boot.
May 13 00:24:47.674182 systemd[1]: Initializing machine ID from VM UUID.
May 13 00:24:47.674194 zram_generator::config[1053]: No configuration found.
May 13 00:24:47.674210 systemd[1]: Populated /etc with preset unit settings.
May 13 00:24:47.674222 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 13 00:24:47.674233 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 13 00:24:47.674245 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 13 00:24:47.674258 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 13 00:24:47.674269 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 13 00:24:47.674294 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 13 00:24:47.674305 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 13 00:24:47.674317 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 13 00:24:47.674332 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 13 00:24:47.674345 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 13 00:24:47.674357 systemd[1]: Created slice user.slice - User and Session Slice.
May 13 00:24:47.674369 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 00:24:47.674382 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 00:24:47.674394 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 13 00:24:47.674407 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 13 00:24:47.674419 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 13 00:24:47.674433 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 00:24:47.674445 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 13 00:24:47.674457 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 00:24:47.674469 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 13 00:24:47.674481 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 13 00:24:47.674493 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 13 00:24:47.674505 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 13 00:24:47.674516 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 00:24:47.674531 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 00:24:47.674543 systemd[1]: Reached target slices.target - Slice Units.
May 13 00:24:47.674554 systemd[1]: Reached target swap.target - Swaps.
May 13 00:24:47.674566 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 13 00:24:47.674578 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 13 00:24:47.674590 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 00:24:47.674602 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 00:24:47.674614 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 00:24:47.674626 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 13 00:24:47.674637 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 13 00:24:47.674652 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 13 00:24:47.674664 systemd[1]: Mounting media.mount - External Media Directory...
May 13 00:24:47.674676 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 00:24:47.674688 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 13 00:24:47.674700 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 13 00:24:47.674712 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 13 00:24:47.674724 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 13 00:24:47.674736 systemd[1]: Reached target machines.target - Containers.
May 13 00:24:47.674750 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 13 00:24:47.674762 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 00:24:47.674775 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 00:24:47.674787 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 13 00:24:47.674799 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 00:24:47.674811 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 13 00:24:47.674823 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 00:24:47.674835 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 13 00:24:47.674847 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 00:24:47.674862 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 13 00:24:47.674874 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 13 00:24:47.674892 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 13 00:24:47.674905 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 13 00:24:47.674916 systemd[1]: Stopped systemd-fsck-usr.service.
May 13 00:24:47.674929 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 00:24:47.674941 kernel: fuse: init (API version 7.39)
May 13 00:24:47.674953 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 00:24:47.674967 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 13 00:24:47.674979 kernel: loop: module loaded
May 13 00:24:47.674990 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 13 00:24:47.675002 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 00:24:47.675014 systemd[1]: verity-setup.service: Deactivated successfully.
May 13 00:24:47.675026 systemd[1]: Stopped verity-setup.service.
May 13 00:24:47.675038 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 00:24:47.675050 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 13 00:24:47.675061 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 13 00:24:47.675075 systemd[1]: Mounted media.mount - External Media Directory.
May 13 00:24:47.675104 systemd-journald[1123]: Collecting audit messages is disabled.
May 13 00:24:47.675126 systemd-journald[1123]: Journal started
May 13 00:24:47.675148 systemd-journald[1123]: Runtime Journal (/run/log/journal/496fa4f863224b2194d29077cf68b0d1) is 6.0M, max 48.4M, 42.3M free.
May 13 00:24:47.696419 kernel: ACPI: bus type drm_connector registered
May 13 00:24:47.461346 systemd[1]: Queued start job for default target multi-user.target.
May 13 00:24:47.476014 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 13 00:24:47.476494 systemd[1]: systemd-journald.service: Deactivated successfully.
May 13 00:24:47.698484 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 13 00:24:47.700535 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 00:24:47.701313 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 13 00:24:47.702561 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 13 00:24:47.703873 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 13 00:24:47.705376 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 00:24:47.706987 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 13 00:24:47.707166 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 13 00:24:47.708716 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 00:24:47.708904 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 00:24:47.710503 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 13 00:24:47.710679 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 13 00:24:47.712089 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 00:24:47.712260 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 00:24:47.713796 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 13 00:24:47.713976 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 13 00:24:47.715390 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 00:24:47.715568 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 00:24:47.717115 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 00:24:47.718557 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 13 00:24:47.720212 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 13 00:24:47.735975 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 13 00:24:47.746362 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 13 00:24:47.748762 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 13 00:24:47.750024 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 13 00:24:47.750109 systemd[1]: Reached target local-fs.target - Local File Systems.
May 13 00:24:47.752168 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
May 13 00:24:47.754515 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 13 00:24:47.758665 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 13 00:24:47.760138 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 00:24:47.762043 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 13 00:24:47.764944 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 13 00:24:47.766959 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 00:24:47.771387 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 13 00:24:47.772925 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 13 00:24:47.778005 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 00:24:47.780402 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 13 00:24:47.787433 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 13 00:24:47.790185 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 13 00:24:47.793748 systemd-journald[1123]: Time spent on flushing to /var/log/journal/496fa4f863224b2194d29077cf68b0d1 is 12.997ms for 950 entries.
May 13 00:24:47.793748 systemd-journald[1123]: System Journal (/var/log/journal/496fa4f863224b2194d29077cf68b0d1) is 8.0M, max 195.6M, 187.6M free.
May 13 00:24:47.820904 systemd-journald[1123]: Received client request to flush runtime journal.
May 13 00:24:47.820937 kernel: loop0: detected capacity change from 0 to 140768
May 13 00:24:47.793460 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 13 00:24:47.796094 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 13 00:24:47.799819 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 00:24:47.803144 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 13 00:24:47.817649 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 13 00:24:47.830496 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
May 13 00:24:47.834305 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 13 00:24:47.835434 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
May 13 00:24:47.837814 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 13 00:24:47.840160 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 00:24:47.849785 udevadm[1181]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
May 13 00:24:47.850754 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 13 00:24:47.858615 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 00:24:47.860967 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 13 00:24:47.861713 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
May 13 00:24:47.885324 kernel: loop1: detected capacity change from 0 to 218376
May 13 00:24:47.903851 systemd-tmpfiles[1185]: ACLs are not supported, ignoring.
May 13 00:24:47.903869 systemd-tmpfiles[1185]: ACLs are not supported, ignoring.
May 13 00:24:47.909781 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 00:24:47.920309 kernel: loop2: detected capacity change from 0 to 142488
May 13 00:24:47.953359 kernel: loop3: detected capacity change from 0 to 140768
May 13 00:24:47.968296 kernel: loop4: detected capacity change from 0 to 218376
May 13 00:24:47.976474 kernel: loop5: detected capacity change from 0 to 142488
May 13 00:24:47.985826 (sd-merge)[1192]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
May 13 00:24:47.986689 (sd-merge)[1192]: Merged extensions into '/usr'.
May 13 00:24:47.990208 systemd[1]: Reloading requested from client PID 1167 ('systemd-sysext') (unit systemd-sysext.service)...
May 13 00:24:47.990222 systemd[1]: Reloading...
May 13 00:24:48.038300 zram_generator::config[1221]: No configuration found.
May 13 00:24:48.080947 ldconfig[1162]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 13 00:24:48.160741 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 00:24:48.209612 systemd[1]: Reloading finished in 218 ms.
May 13 00:24:48.243196 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 13 00:24:48.244910 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 13 00:24:48.261446 systemd[1]: Starting ensure-sysext.service...
May 13 00:24:48.263350 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 00:24:48.268974 systemd[1]: Reloading requested from client PID 1255 ('systemctl') (unit ensure-sysext.service)...
May 13 00:24:48.268992 systemd[1]: Reloading...
May 13 00:24:48.309090 systemd-tmpfiles[1256]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 13 00:24:48.311980 systemd-tmpfiles[1256]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 13 00:24:48.313202 systemd-tmpfiles[1256]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 13 00:24:48.313737 systemd-tmpfiles[1256]: ACLs are not supported, ignoring.
May 13 00:24:48.313895 systemd-tmpfiles[1256]: ACLs are not supported, ignoring.
May 13 00:24:48.321767 systemd-tmpfiles[1256]: Detected autofs mount point /boot during canonicalization of boot.
May 13 00:24:48.321891 systemd-tmpfiles[1256]: Skipping /boot
May 13 00:24:48.328641 zram_generator::config[1286]: No configuration found.
May 13 00:24:48.333680 systemd-tmpfiles[1256]: Detected autofs mount point /boot during canonicalization of boot.
May 13 00:24:48.333699 systemd-tmpfiles[1256]: Skipping /boot
May 13 00:24:48.434311 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 00:24:48.484297 systemd[1]: Reloading finished in 214 ms.
May 13 00:24:48.505138 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 13 00:24:48.519076 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 00:24:48.530088 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
May 13 00:24:48.532851 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 13 00:24:48.535699 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 13 00:24:48.540173 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 13 00:24:48.545566 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 00:24:48.551516 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 13 00:24:48.556307 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 00:24:48.556517 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 00:24:48.558185 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 00:24:48.560913 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 00:24:48.564361 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 00:24:48.565559 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 00:24:48.568534 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 13 00:24:48.569593 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 00:24:48.570503 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 00:24:48.570720 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 00:24:48.586742 systemd-udevd[1333]: Using default interface naming scheme 'v255'.
May 13 00:24:48.597233 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 00:24:48.597512 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 00:24:48.600052 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 00:24:48.600857 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 00:24:48.602872 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 13 00:24:48.605118 augenrules[1347]: No rules
May 13 00:24:48.606563 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
May 13 00:24:48.616900 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 13 00:24:48.621935 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 00:24:48.623766 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 00:24:48.624046 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 00:24:48.630567 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 00:24:48.636502 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 13 00:24:48.645605 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 00:24:48.651082 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 00:24:48.652390 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 00:24:48.656559 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 13 00:24:48.662369 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 13 00:24:48.663639 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 00:24:48.664817 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 13 00:24:48.667061 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 13 00:24:48.669225 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 00:24:48.669508 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 00:24:48.673468 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 13 00:24:48.673700 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 13 00:24:48.675664 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 00:24:48.675919 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 00:24:48.677984 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 00:24:48.678211 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 00:24:48.693058 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 13 00:24:48.695366 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1384)
May 13 00:24:48.704991 systemd[1]: Finished ensure-sysext.service.
May 13 00:24:48.711908 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 13 00:24:48.714566 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 00:24:48.714662 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 13 00:24:48.726434 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 13 00:24:48.727878 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 13 00:24:48.748390 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
May 13 00:24:48.754270 systemd-resolved[1327]: Positive Trust Anchors:
May 13 00:24:48.754310 systemd-resolved[1327]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 13 00:24:48.754342 systemd-resolved[1327]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 13 00:24:48.756786 kernel: ACPI: button: Power Button [PWRF]
May 13 00:24:48.762414 systemd-resolved[1327]: Defaulting to hostname 'linux'.
May 13 00:24:48.766310 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 13 00:24:48.777566 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 13 00:24:48.780075 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 13 00:24:48.782900 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 13 00:24:48.787362 systemd-networkd[1385]: lo: Link UP
May 13 00:24:48.787375 systemd-networkd[1385]: lo: Gained carrier
May 13 00:24:48.789542 systemd-networkd[1385]: Enumeration completed
May 13 00:24:48.789636 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 13 00:24:48.790870 systemd-networkd[1385]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 00:24:48.790882 systemd-networkd[1385]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 00:24:48.791538 systemd[1]: Reached target network.target - Network.
May 13 00:24:48.791660 systemd-networkd[1385]: eth0: Link UP
May 13 00:24:48.791671 systemd-networkd[1385]: eth0: Gained carrier
May 13 00:24:48.791683 systemd-networkd[1385]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 00:24:48.798307 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
May 13 00:24:48.804518 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 13 00:24:48.807053 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 13 00:24:48.808073 systemd[1]: Reached target time-set.target - System Time Set.
May 13 00:24:48.810084 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
May 13 00:24:48.811398 systemd-networkd[1385]: eth0: DHCPv4 address 10.0.0.89/16, gateway 10.0.0.1 acquired from 10.0.0.1
May 13 00:24:48.812186 systemd-timesyncd[1400]: Network configuration changed, trying to establish connection.
May 13 00:24:48.813024 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
May 13 00:24:49.363310 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
May 13 00:24:48.814314 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 13 00:24:49.350647 systemd-timesyncd[1400]: Contacted time server 10.0.0.1:123 (10.0.0.1).
May 13 00:24:49.350686 systemd-timesyncd[1400]: Initial clock synchronization to Tue 2025-05-13 00:24:49.350554 UTC.
May 13 00:24:49.352602 systemd-resolved[1327]: Clock change detected. Flushing caches.
May 13 00:24:49.374555 kernel: mousedev: PS/2 mouse device common for all mice
May 13 00:24:49.442938 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 00:24:49.455915 kernel: kvm_amd: TSC scaling supported
May 13 00:24:49.455955 kernel: kvm_amd: Nested Virtualization enabled
May 13 00:24:49.455968 kernel: kvm_amd: Nested Paging enabled
May 13 00:24:49.456918 kernel: kvm_amd: LBR virtualization supported
May 13 00:24:49.456934 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
May 13 00:24:49.457942 kernel: kvm_amd: Virtual GIF supported
May 13 00:24:49.477577 kernel: EDAC MC: Ver: 3.0.0
May 13 00:24:49.515013 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
May 13 00:24:49.549829 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
May 13 00:24:49.551459 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 00:24:49.558698 lvm[1420]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
May 13 00:24:49.598844 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
May 13 00:24:49.600419 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 00:24:49.601571 systemd[1]: Reached target sysinit.target - System Initialization.
May 13 00:24:49.602740 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 13 00:24:49.604004 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 13 00:24:49.605447 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 13 00:24:49.606674 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 13 00:24:49.607930 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 13 00:24:49.609184 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 13 00:24:49.609211 systemd[1]: Reached target paths.target - Path Units.
May 13 00:24:49.610105 systemd[1]: Reached target timers.target - Timer Units.
May 13 00:24:49.611568 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 13 00:24:49.614224 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 13 00:24:49.623890 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 13 00:24:49.626179 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
May 13 00:24:49.627782 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 13 00:24:49.628952 systemd[1]: Reached target sockets.target - Socket Units.
May 13 00:24:49.629923 systemd[1]: Reached target basic.target - Basic System.
May 13 00:24:49.630905 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 13 00:24:49.630931 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 13 00:24:49.631843 systemd[1]: Starting containerd.service - containerd container runtime...
May 13 00:24:49.633913 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 13 00:24:49.638583 lvm[1425]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
May 13 00:24:49.638622 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 13 00:24:49.642788 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 13 00:24:49.643913 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 13 00:24:49.644439 jq[1428]: false
May 13 00:24:49.646518 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 13 00:24:49.649584 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 13 00:24:49.655782 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 13 00:24:49.659338 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 13 00:24:49.661817 extend-filesystems[1429]: Found loop3
May 13 00:24:49.663411 extend-filesystems[1429]: Found loop4
May 13 00:24:49.663411 extend-filesystems[1429]: Found loop5
May 13 00:24:49.663411 extend-filesystems[1429]: Found sr0
May 13 00:24:49.663411 extend-filesystems[1429]: Found vda
May 13 00:24:49.663411 extend-filesystems[1429]: Found vda1
May 13 00:24:49.663411 extend-filesystems[1429]: Found vda2
May 13 00:24:49.663411 extend-filesystems[1429]: Found vda3
May 13 00:24:49.663411 extend-filesystems[1429]: Found usr
May 13 00:24:49.663411 extend-filesystems[1429]: Found vda4
May 13 00:24:49.663411 extend-filesystems[1429]: Found vda6
May 13 00:24:49.663411 extend-filesystems[1429]: Found vda7
May 13 00:24:49.663411 extend-filesystems[1429]: Found vda9
May 13 00:24:49.663411 extend-filesystems[1429]: Checking size of /dev/vda9
May 13 00:24:49.673262 dbus-daemon[1427]: [system] SELinux support is enabled
May 13 00:24:49.666622 systemd[1]: Starting systemd-logind.service - User Login Management...
May 13 00:24:49.669170 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 13 00:24:49.669657 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 13 00:24:49.674763 systemd[1]: Starting update-engine.service - Update Engine...
May 13 00:24:49.685635 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 13 00:24:49.689416 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 13 00:24:49.690811 extend-filesystems[1429]: Resized partition /dev/vda9
May 13 00:24:49.693317 update_engine[1439]: I20250513 00:24:49.693248 1439 main.cc:92] Flatcar Update Engine starting
May 13 00:24:49.693617 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
May 13 00:24:49.694573 extend-filesystems[1450]: resize2fs 1.47.1 (20-May-2024)
May 13 00:24:49.697558 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
May 13 00:24:49.700391 update_engine[1439]: I20250513 00:24:49.699860 1439 update_check_scheduler.cc:74] Next update check in 8m56s
May 13 00:24:49.700556 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1361)
May 13 00:24:49.705053 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 13 00:24:49.705292 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 13 00:24:49.705661 systemd[1]: motdgen.service: Deactivated successfully.
May 13 00:24:49.705861 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 13 00:24:49.710955 jq[1448]: true
May 13 00:24:49.711706 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 13 00:24:49.711916 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 13 00:24:49.725633 jq[1454]: true
May 13 00:24:49.732556 kernel: EXT4-fs (vda9): resized filesystem to 1864699
May 13 00:24:49.765474 tar[1453]: linux-amd64/LICENSE
May 13 00:24:49.735165 (ntainerd)[1455]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 13 00:24:49.754503 systemd[1]: Started update-engine.service - Update Engine.
May 13 00:24:49.766592 tar[1453]: linux-amd64/helm
May 13 00:24:49.756246 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 13 00:24:49.756269 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 13 00:24:49.757583 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 13 00:24:49.757598 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 13 00:24:49.765899 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 13 00:24:49.765927 systemd-logind[1436]: Watching system buttons on /dev/input/event1 (Power Button)
May 13 00:24:49.765948 systemd-logind[1436]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 13 00:24:49.767206 systemd-logind[1436]: New seat seat0.
May 13 00:24:49.768869 extend-filesystems[1450]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
May 13 00:24:49.768869 extend-filesystems[1450]: old_desc_blocks = 1, new_desc_blocks = 1
May 13 00:24:49.768869 extend-filesystems[1450]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
May 13 00:24:49.773897 extend-filesystems[1429]: Resized filesystem in /dev/vda9
May 13 00:24:49.771244 systemd[1]: Started systemd-logind.service - User Login Management.
May 13 00:24:49.779840 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 13 00:24:49.780095 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 13 00:24:49.793591 bash[1481]: Updated "/home/core/.ssh/authorized_keys"
May 13 00:24:49.793467 locksmithd[1480]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 13 00:24:49.796282 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 13 00:24:49.799218 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
May 13 00:24:49.939445 containerd[1455]: time="2025-05-13T00:24:49.939281674Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
May 13 00:24:49.961942 containerd[1455]: time="2025-05-13T00:24:49.961893727Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
May 13 00:24:49.963653 containerd[1455]: time="2025-05-13T00:24:49.963596932Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.89-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
May 13 00:24:49.963653 containerd[1455]: time="2025-05-13T00:24:49.963625516Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
May 13 00:24:49.963653 containerd[1455]: time="2025-05-13T00:24:49.963640454Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
May 13 00:24:49.963833 containerd[1455]: time="2025-05-13T00:24:49.963812927Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
May 13 00:24:49.963873 containerd[1455]: time="2025-05-13T00:24:49.963832995Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
May 13 00:24:49.963942 containerd[1455]: time="2025-05-13T00:24:49.963895252Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
May 13 00:24:49.963942 containerd[1455]: time="2025-05-13T00:24:49.963911192Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
May 13 00:24:49.964150 containerd[1455]: time="2025-05-13T00:24:49.964118200Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
May 13 00:24:49.964150 containerd[1455]: time="2025-05-13T00:24:49.964137386Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
May 13 00:24:49.964202 containerd[1455]: time="2025-05-13T00:24:49.964150220Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
May 13 00:24:49.964202 containerd[1455]: time="2025-05-13T00:24:49.964160359Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
May 13 00:24:49.964277 containerd[1455]: time="2025-05-13T00:24:49.964259605Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
May 13 00:24:49.964524 containerd[1455]: time="2025-05-13T00:24:49.964497782Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
May 13 00:24:49.964661 containerd[1455]: time="2025-05-13T00:24:49.964636803Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
May 13 00:24:49.964661 containerd[1455]: time="2025-05-13T00:24:49.964655828Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
May 13 00:24:49.964768 containerd[1455]: time="2025-05-13T00:24:49.964749424Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
May 13 00:24:49.964823 containerd[1455]: time="2025-05-13T00:24:49.964806762Z" level=info msg="metadata content store policy set" policy=shared
May 13 00:24:49.985559 sshd_keygen[1447]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 13 00:24:50.009063 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 13 00:24:50.019865 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 13 00:24:50.028762 systemd[1]: issuegen.service: Deactivated successfully.
May 13 00:24:50.029011 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 13 00:24:50.038830 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 13 00:24:50.054829 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 13 00:24:50.068771 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 13 00:24:50.071193 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 13 00:24:50.072563 systemd[1]: Reached target getty.target - Login Prompts.
May 13 00:24:50.154001 containerd[1455]: time="2025-05-13T00:24:50.153950371Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
May 13 00:24:50.154137 containerd[1455]: time="2025-05-13T00:24:50.154021294Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
May 13 00:24:50.154137 containerd[1455]: time="2025-05-13T00:24:50.154043816Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
May 13 00:24:50.154137 containerd[1455]: time="2025-05-13T00:24:50.154063052Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
May 13 00:24:50.154137 containerd[1455]: time="2025-05-13T00:24:50.154103839Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
May 13 00:24:50.154325 containerd[1455]: time="2025-05-13T00:24:50.154304575Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154531280Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154665933Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154679128Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154695559Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154707661Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154719433Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154732448Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154745122Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154757324Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154770038Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154781620Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154792440Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154810474Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156562 containerd[1455]: time="2025-05-13T00:24:50.154822988Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154834339Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154845800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154857412Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154869946Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154881117Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154892127Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154908197Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154923977Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154934717Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154947040Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154958131Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154972398Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.154991844Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.155002725Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
May 13 00:24:50.156900 containerd[1455]: time="2025-05-13T00:24:50.155013134Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
May 13 00:24:50.157165 containerd[1455]: time="2025-05-13T00:24:50.155267602Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
May 13 00:24:50.157165 containerd[1455]: time="2025-05-13T00:24:50.155284944Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
May 13 00:24:50.157165 containerd[1455]: time="2025-05-13T00:24:50.155295644Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
May 13 00:24:50.157165 containerd[1455]: time="2025-05-13T00:24:50.155306334Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
May 13 00:24:50.157165 containerd[1455]: time="2025-05-13T00:24:50.155315171Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
May 13 00:24:50.157165 containerd[1455]: time="2025-05-13T00:24:50.155326031Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
May 13 00:24:50.157165 containerd[1455]: time="2025-05-13T00:24:50.155335339Z" level=info msg="NRI interface is disabled by configuration."
May 13 00:24:50.157165 containerd[1455]: time="2025-05-13T00:24:50.155345838Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
May 13 00:24:50.157304 containerd[1455]: time="2025-05-13T00:24:50.155614061Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
May 13 00:24:50.157304 containerd[1455]: time="2025-05-13T00:24:50.155662532Z" level=info msg="Connect containerd service"
May 13 00:24:50.157304 containerd[1455]: time="2025-05-13T00:24:50.155700013Z" level=info msg="using legacy CRI server"
May 13 00:24:50.157304 containerd[1455]: time="2025-05-13T00:24:50.155708178Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
May 13 00:24:50.157304 containerd[1455]: time="2025-05-13T00:24:50.155789000Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
May 13 00:24:50.157304 containerd[1455]: time="2025-05-13T00:24:50.156366793Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 13 00:24:50.157756 containerd[1455]: time="2025-05-13T00:24:50.156768878Z" level=info msg="Start subscribing containerd event"
May 13 00:24:50.159447 containerd[1455]: time="2025-05-13T00:24:50.159398019Z" level=info msg="Start recovering state"
May 13 00:24:50.159781 containerd[1455]: time="2025-05-13T00:24:50.159520599Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
May 13 00:24:50.159781 containerd[1455]: time="2025-05-13T00:24:50.159561987Z" level=info msg="Start event monitor"
May 13 00:24:50.159781 containerd[1455]: time="2025-05-13T00:24:50.159630144Z" level=info msg="Start snapshots syncer"
May 13 00:24:50.159781 containerd[1455]: time="2025-05-13T00:24:50.159643790Z" level=info msg="Start cni network conf syncer for default"
May 13 00:24:50.159781 containerd[1455]: time="2025-05-13T00:24:50.159653088Z" level=info msg="Start streaming server"
May 13 00:24:50.159781 containerd[1455]: time="2025-05-13T00:24:50.159669228Z" level=info msg=serving... address=/run/containerd/containerd.sock
May 13 00:24:50.159934 containerd[1455]: time="2025-05-13T00:24:50.159828577Z" level=info msg="containerd successfully booted in 0.221864s"
May 13 00:24:50.159932 systemd[1]: Started containerd.service - containerd container runtime.
May 13 00:24:50.162900 tar[1453]: linux-amd64/README.md
May 13 00:24:50.178355 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 13 00:24:50.985779 systemd-networkd[1385]: eth0: Gained IPv6LL
May 13 00:24:50.988970 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 13 00:24:50.996132 systemd[1]: Reached target network-online.target - Network is Online.
May 13 00:24:51.007797 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
May 13 00:24:51.010508 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 00:24:51.012700 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 13 00:24:51.033178 systemd[1]: coreos-metadata.service: Deactivated successfully.
May 13 00:24:51.033430 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
May 13 00:24:51.035329 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 13 00:24:51.037592 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 13 00:24:51.677344 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 00:24:51.678942 systemd[1]: Reached target multi-user.target - Multi-User System.
May 13 00:24:51.682633 systemd[1]: Startup finished in 727ms (kernel) + 5.279s (initrd) + 4.217s (userspace) = 10.225s.
May 13 00:24:51.697906 (kubelet)[1539]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 00:24:52.089949 kubelet[1539]: E0513 00:24:52.089763 1539 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 00:24:52.093567 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 00:24:52.093775 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 00:24:55.251877 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 13 00:24:55.253156 systemd[1]: Started sshd@0-10.0.0.89:22-10.0.0.1:53778.service - OpenSSH per-connection server daemon (10.0.0.1:53778).
May 13 00:24:55.296803 sshd[1552]: Accepted publickey for core from 10.0.0.1 port 53778 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:24:55.298971 sshd[1552]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:24:55.307663 systemd-logind[1436]: New session 1 of user core.
May 13 00:24:55.308928 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
May 13 00:24:55.320753 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
May 13 00:24:55.333262 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
May 13 00:24:55.335208 systemd[1]: Starting user@500.service - User Manager for UID 500...
May 13 00:24:55.343698 (systemd)[1556]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
May 13 00:24:55.454828 systemd[1556]: Queued start job for default target default.target.
May 13 00:24:55.472848 systemd[1556]: Created slice app.slice - User Application Slice.
May 13 00:24:55.472875 systemd[1556]: Reached target paths.target - Paths.
May 13 00:24:55.472889 systemd[1556]: Reached target timers.target - Timers.
May 13 00:24:55.474503 systemd[1556]: Starting dbus.socket - D-Bus User Message Bus Socket...
May 13 00:24:55.486603 systemd[1556]: Listening on dbus.socket - D-Bus User Message Bus Socket.
May 13 00:24:55.486771 systemd[1556]: Reached target sockets.target - Sockets.
May 13 00:24:55.486792 systemd[1556]: Reached target basic.target - Basic System.
May 13 00:24:55.486845 systemd[1556]: Reached target default.target - Main User Target.
May 13 00:24:55.486883 systemd[1556]: Startup finished in 136ms.
May 13 00:24:55.487137 systemd[1]: Started user@500.service - User Manager for UID 500.
May 13 00:24:55.494674 systemd[1]: Started session-1.scope - Session 1 of User core.
May 13 00:24:55.555693 systemd[1]: Started sshd@1-10.0.0.89:22-10.0.0.1:53782.service - OpenSSH per-connection server daemon (10.0.0.1:53782).
May 13 00:24:55.589005 sshd[1567]: Accepted publickey for core from 10.0.0.1 port 53782 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:24:55.590463 sshd[1567]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:24:55.594393 systemd-logind[1436]: New session 2 of user core.
May 13 00:24:55.603679 systemd[1]: Started session-2.scope - Session 2 of User core.
May 13 00:24:55.656783 sshd[1567]: pam_unix(sshd:session): session closed for user core
May 13 00:24:55.667605 systemd[1]: sshd@1-10.0.0.89:22-10.0.0.1:53782.service: Deactivated successfully.
May 13 00:24:55.669342 systemd[1]: session-2.scope: Deactivated successfully.
May 13 00:24:55.670988 systemd-logind[1436]: Session 2 logged out. Waiting for processes to exit.
May 13 00:24:55.672250 systemd[1]: Started sshd@2-10.0.0.89:22-10.0.0.1:53784.service - OpenSSH per-connection server daemon (10.0.0.1:53784).
May 13 00:24:55.672994 systemd-logind[1436]: Removed session 2.
May 13 00:24:55.704712 sshd[1574]: Accepted publickey for core from 10.0.0.1 port 53784 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:24:55.706139 sshd[1574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:24:55.710199 systemd-logind[1436]: New session 3 of user core.
May 13 00:24:55.720661 systemd[1]: Started session-3.scope - Session 3 of User core.
May 13 00:24:55.769613 sshd[1574]: pam_unix(sshd:session): session closed for user core
May 13 00:24:55.782454 systemd[1]: sshd@2-10.0.0.89:22-10.0.0.1:53784.service: Deactivated successfully.
May 13 00:24:55.784273 systemd[1]: session-3.scope: Deactivated successfully.
May 13 00:24:55.785627 systemd-logind[1436]: Session 3 logged out. Waiting for processes to exit.
May 13 00:24:55.786882 systemd[1]: Started sshd@3-10.0.0.89:22-10.0.0.1:53792.service - OpenSSH per-connection server daemon (10.0.0.1:53792).
May 13 00:24:55.787862 systemd-logind[1436]: Removed session 3.
May 13 00:24:55.819022 sshd[1581]: Accepted publickey for core from 10.0.0.1 port 53792 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:24:55.820485 sshd[1581]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:24:55.824260 systemd-logind[1436]: New session 4 of user core.
May 13 00:24:55.834658 systemd[1]: Started session-4.scope - Session 4 of User core.
May 13 00:24:55.887722 sshd[1581]: pam_unix(sshd:session): session closed for user core
May 13 00:24:55.897279 systemd[1]: sshd@3-10.0.0.89:22-10.0.0.1:53792.service: Deactivated successfully.
May 13 00:24:55.899048 systemd[1]: session-4.scope: Deactivated successfully.
May 13 00:24:55.900720 systemd-logind[1436]: Session 4 logged out. Waiting for processes to exit.
May 13 00:24:55.909767 systemd[1]: Started sshd@4-10.0.0.89:22-10.0.0.1:53808.service - OpenSSH per-connection server daemon (10.0.0.1:53808).
May 13 00:24:55.910642 systemd-logind[1436]: Removed session 4.
May 13 00:24:55.937961 sshd[1588]: Accepted publickey for core from 10.0.0.1 port 53808 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:24:55.939522 sshd[1588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:24:55.943169 systemd-logind[1436]: New session 5 of user core.
May 13 00:24:55.952644 systemd[1]: Started session-5.scope - Session 5 of User core.
May 13 00:24:56.012069 sudo[1591]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 13 00:24:56.012507 sudo[1591]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 00:24:56.029621 sudo[1591]: pam_unix(sudo:session): session closed for user root
May 13 00:24:56.031665 sshd[1588]: pam_unix(sshd:session): session closed for user core
May 13 00:24:56.045379 systemd[1]: sshd@4-10.0.0.89:22-10.0.0.1:53808.service: Deactivated successfully.
May 13 00:24:56.047269 systemd[1]: session-5.scope: Deactivated successfully.
May 13 00:24:56.048616 systemd-logind[1436]: Session 5 logged out. Waiting for processes to exit.
May 13 00:24:56.060758 systemd[1]: Started sshd@5-10.0.0.89:22-10.0.0.1:53816.service - OpenSSH per-connection server daemon (10.0.0.1:53816).
May 13 00:24:56.061666 systemd-logind[1436]: Removed session 5.
May 13 00:24:56.089646 sshd[1596]: Accepted publickey for core from 10.0.0.1 port 53816 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:24:56.091060 sshd[1596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:24:56.094673 systemd-logind[1436]: New session 6 of user core.
May 13 00:24:56.104657 systemd[1]: Started session-6.scope - Session 6 of User core.
May 13 00:24:56.157498 sudo[1600]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 13 00:24:56.157863 sudo[1600]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 00:24:56.161329 sudo[1600]: pam_unix(sudo:session): session closed for user root
May 13 00:24:56.167421 sudo[1599]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
May 13 00:24:56.167764 sudo[1599]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 00:24:56.193757 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
May 13 00:24:56.195334 auditctl[1603]: No rules
May 13 00:24:56.195758 systemd[1]: audit-rules.service: Deactivated successfully.
May 13 00:24:56.195976 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
May 13 00:24:56.198499 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
May 13 00:24:56.227012 augenrules[1621]: No rules
May 13 00:24:56.228743 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
May 13 00:24:56.229916 sudo[1599]: pam_unix(sudo:session): session closed for user root
May 13 00:24:56.231567 sshd[1596]: pam_unix(sshd:session): session closed for user core
May 13 00:24:56.243131 systemd[1]: sshd@5-10.0.0.89:22-10.0.0.1:53816.service: Deactivated successfully.
May 13 00:24:56.244786 systemd[1]: session-6.scope: Deactivated successfully.
May 13 00:24:56.246342 systemd-logind[1436]: Session 6 logged out. Waiting for processes to exit.
May 13 00:24:56.252901 systemd[1]: Started sshd@6-10.0.0.89:22-10.0.0.1:53830.service - OpenSSH per-connection server daemon (10.0.0.1:53830).
May 13 00:24:56.253939 systemd-logind[1436]: Removed session 6.
May 13 00:24:56.280165 sshd[1629]: Accepted publickey for core from 10.0.0.1 port 53830 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:24:56.281597 sshd[1629]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:24:56.285414 systemd-logind[1436]: New session 7 of user core.
May 13 00:24:56.299673 systemd[1]: Started session-7.scope - Session 7 of User core.
May 13 00:24:56.352398 sudo[1633]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 13 00:24:56.352753 sudo[1633]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 00:24:56.637809 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 13 00:24:56.637886 (dockerd)[1651]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 13 00:24:56.913018 dockerd[1651]: time="2025-05-13T00:24:56.912867748Z" level=info msg="Starting up"
May 13 00:24:57.511591 dockerd[1651]: time="2025-05-13T00:24:57.511527903Z" level=info msg="Loading containers: start."
May 13 00:24:57.614565 kernel: Initializing XFRM netlink socket
May 13 00:24:57.688705 systemd-networkd[1385]: docker0: Link UP
May 13 00:24:57.707949 dockerd[1651]: time="2025-05-13T00:24:57.707908367Z" level=info msg="Loading containers: done."
May 13 00:24:57.722162 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2135863370-merged.mount: Deactivated successfully.
May 13 00:24:57.724945 dockerd[1651]: time="2025-05-13T00:24:57.724898095Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 13 00:24:57.725030 dockerd[1651]: time="2025-05-13T00:24:57.725009313Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
May 13 00:24:57.725146 dockerd[1651]: time="2025-05-13T00:24:57.725126032Z" level=info msg="Daemon has completed initialization"
May 13 00:24:57.764550 dockerd[1651]: time="2025-05-13T00:24:57.764379140Z" level=info msg="API listen on /run/docker.sock"
May 13 00:24:57.764642 systemd[1]: Started docker.service - Docker Application Container Engine.
May 13 00:24:58.563132 containerd[1455]: time="2025-05-13T00:24:58.563095033Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\""
May 13 00:24:59.513912 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3934244436.mount: Deactivated successfully.
May 13 00:25:00.620837 containerd[1455]: time="2025-05-13T00:25:00.620761236Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:00.651367 containerd[1455]: time="2025-05-13T00:25:00.651295049Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.4: active requests=0, bytes read=28682879"
May 13 00:25:00.668777 containerd[1455]: time="2025-05-13T00:25:00.668732386Z" level=info msg="ImageCreate event name:\"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:00.672081 containerd[1455]: time="2025-05-13T00:25:00.672032426Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:00.673133 containerd[1455]: time="2025-05-13T00:25:00.673085662Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.4\" with image id \"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\", size \"28679679\" in 2.109950464s"
May 13 00:25:00.673133 containerd[1455]: time="2025-05-13T00:25:00.673123673Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\" returns image reference \"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\""
May 13 00:25:00.673855 containerd[1455]: time="2025-05-13T00:25:00.673812575Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\""
May 13 00:25:02.025170 containerd[1455]: time="2025-05-13T00:25:02.025103581Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:02.053226 containerd[1455]: time="2025-05-13T00:25:02.053175736Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.4: active requests=0, bytes read=24779589"
May 13 00:25:02.079468 containerd[1455]: time="2025-05-13T00:25:02.079406177Z" level=info msg="ImageCreate event name:\"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:02.130370 containerd[1455]: time="2025-05-13T00:25:02.130302654Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:02.131354 containerd[1455]: time="2025-05-13T00:25:02.131310505Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.4\" with image id \"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\", size \"26267962\" in 1.457459748s"
May 13 00:25:02.131393 containerd[1455]: time="2025-05-13T00:25:02.131358124Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\" returns image reference \"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\""
May 13 00:25:02.131949 containerd[1455]: time="2025-05-13T00:25:02.131837904Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\""
May 13 00:25:02.243925 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 13 00:25:02.258711 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 00:25:02.420858 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 00:25:02.426462 (kubelet)[1866]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 00:25:02.469666 kubelet[1866]: E0513 00:25:02.469507 1866 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 00:25:02.476975 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 00:25:02.477183 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 00:25:04.397323 containerd[1455]: time="2025-05-13T00:25:04.397263285Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:04.398277 containerd[1455]: time="2025-05-13T00:25:04.398231260Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.4: active requests=0, bytes read=19169938"
May 13 00:25:04.400023 containerd[1455]: time="2025-05-13T00:25:04.399973929Z" level=info msg="ImageCreate event name:\"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:04.404267 containerd[1455]: time="2025-05-13T00:25:04.402830337Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:04.405421 containerd[1455]: time="2025-05-13T00:25:04.405373677Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.4\" with image id \"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\", size \"20658329\" in 2.273490779s"
May 13 00:25:04.405464 containerd[1455]: time="2025-05-13T00:25:04.405436616Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\" returns image reference \"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\""
May 13 00:25:04.406010 containerd[1455]: time="2025-05-13T00:25:04.405954737Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\""
May 13 00:25:05.367036 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount598706027.mount: Deactivated successfully.
May 13 00:25:05.637411 containerd[1455]: time="2025-05-13T00:25:05.637294088Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:05.638282 containerd[1455]: time="2025-05-13T00:25:05.638240483Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.4: active requests=0, bytes read=30917856"
May 13 00:25:05.639516 containerd[1455]: time="2025-05-13T00:25:05.639485739Z" level=info msg="ImageCreate event name:\"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:05.641565 containerd[1455]: time="2025-05-13T00:25:05.641507321Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:05.642120 containerd[1455]: time="2025-05-13T00:25:05.642075647Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.4\" with image id \"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\", repo tag \"registry.k8s.io/kube-proxy:v1.32.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\", size \"30916875\" in 1.236093939s"
May 13 00:25:05.642149 containerd[1455]: time="2025-05-13T00:25:05.642121112Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\" returns image reference \"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\""
May 13 00:25:05.642647 containerd[1455]: time="2025-05-13T00:25:05.642623995Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
May 13 00:25:06.170353 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3298549569.mount: Deactivated successfully.
May 13 00:25:07.225999 containerd[1455]: time="2025-05-13T00:25:07.225940635Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:07.226728 containerd[1455]: time="2025-05-13T00:25:07.226657760Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
May 13 00:25:07.227825 containerd[1455]: time="2025-05-13T00:25:07.227794472Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:07.230786 containerd[1455]: time="2025-05-13T00:25:07.230739907Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:07.231944 containerd[1455]: time="2025-05-13T00:25:07.231908489Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.589253616s"
May 13 00:25:07.231975 containerd[1455]: time="2025-05-13T00:25:07.231941841Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
May 13 00:25:07.232405 containerd[1455]: time="2025-05-13T00:25:07.232382007Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 13 00:25:07.714193 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2990122193.mount: Deactivated successfully.
May 13 00:25:07.720302 containerd[1455]: time="2025-05-13T00:25:07.720258534Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:07.721043 containerd[1455]: time="2025-05-13T00:25:07.721007409Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
May 13 00:25:07.722250 containerd[1455]: time="2025-05-13T00:25:07.722224872Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:07.724462 containerd[1455]: time="2025-05-13T00:25:07.724398239Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:07.725117 containerd[1455]: time="2025-05-13T00:25:07.725085328Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 492.676981ms"
May 13 00:25:07.725117 containerd[1455]: time="2025-05-13T00:25:07.725114653Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
May 13 00:25:07.725597 containerd[1455]: time="2025-05-13T00:25:07.725574345Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
May 13 00:25:08.250514 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3651995383.mount: Deactivated successfully.
May 13 00:25:09.845410 containerd[1455]: time="2025-05-13T00:25:09.845340250Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:09.846246 containerd[1455]: time="2025-05-13T00:25:09.846184553Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551360"
May 13 00:25:09.847470 containerd[1455]: time="2025-05-13T00:25:09.847423317Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:09.850384 containerd[1455]: time="2025-05-13T00:25:09.850337814Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:09.851569 containerd[1455]: time="2025-05-13T00:25:09.851514070Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.125824028s"
May 13 00:25:09.851601 containerd[1455]: time="2025-05-13T00:25:09.851567711Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
May 13 00:25:11.857446 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 00:25:11.867753 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 00:25:11.890184 systemd[1]: Reloading requested from client PID 2028 ('systemctl') (unit session-7.scope)...
May 13 00:25:11.890202 systemd[1]: Reloading...
May 13 00:25:11.976567 zram_generator::config[2073]: No configuration found.
May 13 00:25:12.149744 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 00:25:12.226224 systemd[1]: Reloading finished in 335 ms.
May 13 00:25:12.273487 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 13 00:25:12.273595 systemd[1]: kubelet.service: Failed with result 'signal'.
May 13 00:25:12.273847 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 00:25:12.275384 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 00:25:12.432813 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 00:25:12.438793 (kubelet)[2115]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 13 00:25:12.473739 kubelet[2115]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 00:25:12.473739 kubelet[2115]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 13 00:25:12.473739 kubelet[2115]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 00:25:12.474131 kubelet[2115]: I0513 00:25:12.473797 2115 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 00:25:12.796245 kubelet[2115]: I0513 00:25:12.796185 2115 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" May 13 00:25:12.796245 kubelet[2115]: I0513 00:25:12.796229 2115 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 00:25:12.796628 kubelet[2115]: I0513 00:25:12.796599 2115 server.go:954] "Client rotation is on, will bootstrap in background" May 13 00:25:12.819154 kubelet[2115]: E0513 00:25:12.818978 2115 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.89:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:12.820400 kubelet[2115]: I0513 00:25:12.820369 2115 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 00:25:12.829522 kubelet[2115]: E0513 00:25:12.829469 2115 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" May 13 00:25:12.829522 kubelet[2115]: I0513 00:25:12.829504 2115 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." 
May 13 00:25:12.834294 kubelet[2115]: I0513 00:25:12.834245 2115 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 13 00:25:12.835362 kubelet[2115]: I0513 00:25:12.835309 2115 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 00:25:12.835521 kubelet[2115]: I0513 00:25:12.835341 2115 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVer
sion":2} May 13 00:25:12.835521 kubelet[2115]: I0513 00:25:12.835509 2115 topology_manager.go:138] "Creating topology manager with none policy" May 13 00:25:12.835521 kubelet[2115]: I0513 00:25:12.835518 2115 container_manager_linux.go:304] "Creating device plugin manager" May 13 00:25:12.835697 kubelet[2115]: I0513 00:25:12.835654 2115 state_mem.go:36] "Initialized new in-memory state store" May 13 00:25:12.838063 kubelet[2115]: I0513 00:25:12.838026 2115 kubelet.go:446] "Attempting to sync node with API server" May 13 00:25:12.838063 kubelet[2115]: I0513 00:25:12.838046 2115 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 00:25:12.838063 kubelet[2115]: I0513 00:25:12.838060 2115 kubelet.go:352] "Adding apiserver pod source" May 13 00:25:12.838063 kubelet[2115]: I0513 00:25:12.838070 2115 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 00:25:12.840886 kubelet[2115]: I0513 00:25:12.840847 2115 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" May 13 00:25:12.841187 kubelet[2115]: I0513 00:25:12.841163 2115 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 00:25:12.842583 kubelet[2115]: W0513 00:25:12.842274 2115 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 13 00:25:12.843035 kubelet[2115]: W0513 00:25:12.842978 2115 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.89:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.89:6443: connect: connection refused May 13 00:25:12.843083 kubelet[2115]: E0513 00:25:12.843032 2115 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.89:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:12.844289 kubelet[2115]: W0513 00:25:12.844235 2115 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.89:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.89:6443: connect: connection refused May 13 00:25:12.844412 kubelet[2115]: E0513 00:25:12.844378 2115 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.89:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:12.845565 kubelet[2115]: I0513 00:25:12.845531 2115 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 13 00:25:12.845612 kubelet[2115]: I0513 00:25:12.845574 2115 server.go:1287] "Started kubelet" May 13 00:25:12.845752 kubelet[2115]: I0513 00:25:12.845692 2115 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 13 00:25:12.845781 kubelet[2115]: I0513 00:25:12.845705 2115 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 00:25:12.846450 kubelet[2115]: I0513 00:25:12.846319 
2115 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 00:25:12.848007 kubelet[2115]: I0513 00:25:12.847787 2115 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 00:25:12.849260 kubelet[2115]: I0513 00:25:12.848997 2115 server.go:490] "Adding debug handlers to kubelet server" May 13 00:25:12.850576 kubelet[2115]: I0513 00:25:12.850552 2115 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 00:25:12.851862 kubelet[2115]: I0513 00:25:12.851730 2115 volume_manager.go:297] "Starting Kubelet Volume Manager" May 13 00:25:12.851923 kubelet[2115]: E0513 00:25:12.851912 2115 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 00:25:12.852222 kubelet[2115]: I0513 00:25:12.852204 2115 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 13 00:25:12.852266 kubelet[2115]: I0513 00:25:12.852251 2115 reconciler.go:26] "Reconciler: start to sync state" May 13 00:25:12.853168 kubelet[2115]: E0513 00:25:12.852684 2115 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.89:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.89:6443: connect: connection refused" interval="200ms" May 13 00:25:12.853168 kubelet[2115]: W0513 00:25:12.852757 2115 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.89:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.89:6443: connect: connection refused May 13 00:25:12.853168 kubelet[2115]: E0513 00:25:12.852797 2115 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://10.0.0.89:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:12.853561 kubelet[2115]: E0513 00:25:12.851567 2115 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.89:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.89:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183eee79e9b23c0c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-13 00:25:12.845556748 +0000 UTC m=+0.402793506,LastTimestamp:2025-05-13 00:25:12.845556748 +0000 UTC m=+0.402793506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 13 00:25:12.854151 kubelet[2115]: E0513 00:25:12.854015 2115 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 00:25:12.855051 kubelet[2115]: I0513 00:25:12.855030 2115 factory.go:221] Registration of the containerd container factory successfully May 13 00:25:12.855051 kubelet[2115]: I0513 00:25:12.855047 2115 factory.go:221] Registration of the systemd container factory successfully May 13 00:25:12.855173 kubelet[2115]: I0513 00:25:12.855135 2115 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 00:25:12.865595 kubelet[2115]: I0513 00:25:12.865550 2115 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" May 13 00:25:12.866709 kubelet[2115]: I0513 00:25:12.866694 2115 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 13 00:25:12.867032 kubelet[2115]: I0513 00:25:12.866775 2115 status_manager.go:227] "Starting to sync pod status with apiserver" May 13 00:25:12.867032 kubelet[2115]: I0513 00:25:12.866804 2115 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 13 00:25:12.867032 kubelet[2115]: I0513 00:25:12.866812 2115 kubelet.go:2388] "Starting kubelet main sync loop" May 13 00:25:12.867032 kubelet[2115]: E0513 00:25:12.866859 2115 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 00:25:12.870498 kubelet[2115]: W0513 00:25:12.870425 2115 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.89:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.89:6443: connect: connection refused May 13 00:25:12.870498 kubelet[2115]: E0513 00:25:12.870484 2115 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.89:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:12.870965 kubelet[2115]: I0513 00:25:12.870948 2115 cpu_manager.go:221] "Starting CPU manager" policy="none" May 13 00:25:12.870965 kubelet[2115]: I0513 00:25:12.870960 2115 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 13 00:25:12.871046 kubelet[2115]: I0513 00:25:12.870976 2115 state_mem.go:36] "Initialized new in-memory state store" May 13 00:25:12.952381 kubelet[2115]: E0513 00:25:12.952339 2115 kubelet_node_status.go:467] "Error getting the 
current node from lister" err="node \"localhost\" not found" May 13 00:25:12.967951 kubelet[2115]: E0513 00:25:12.967908 2115 kubelet.go:2412] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 00:25:13.053325 kubelet[2115]: E0513 00:25:13.053227 2115 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 00:25:13.053524 kubelet[2115]: E0513 00:25:13.053497 2115 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.89:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.89:6443: connect: connection refused" interval="400ms" May 13 00:25:13.154146 kubelet[2115]: E0513 00:25:13.154115 2115 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 00:25:13.168330 kubelet[2115]: E0513 00:25:13.168299 2115 kubelet.go:2412] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 00:25:13.230485 kubelet[2115]: I0513 00:25:13.230427 2115 policy_none.go:49] "None policy: Start" May 13 00:25:13.230485 kubelet[2115]: I0513 00:25:13.230473 2115 memory_manager.go:186] "Starting memorymanager" policy="None" May 13 00:25:13.230485 kubelet[2115]: I0513 00:25:13.230487 2115 state_mem.go:35] "Initializing new in-memory state store" May 13 00:25:13.237799 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 13 00:25:13.251640 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 13 00:25:13.254198 kubelet[2115]: E0513 00:25:13.254164 2115 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 00:25:13.254455 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
May 13 00:25:13.268495 kubelet[2115]: I0513 00:25:13.268456 2115 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 00:25:13.268787 kubelet[2115]: I0513 00:25:13.268682 2115 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 00:25:13.268787 kubelet[2115]: I0513 00:25:13.268699 2115 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 00:25:13.269194 kubelet[2115]: I0513 00:25:13.268918 2115 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 00:25:13.269611 kubelet[2115]: E0513 00:25:13.269587 2115 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 13 00:25:13.269648 kubelet[2115]: E0513 00:25:13.269618 2115 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" May 13 00:25:13.370208 kubelet[2115]: I0513 00:25:13.370106 2115 kubelet_node_status.go:76] "Attempting to register node" node="localhost" May 13 00:25:13.370581 kubelet[2115]: E0513 00:25:13.370523 2115 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.89:6443/api/v1/nodes\": dial tcp 10.0.0.89:6443: connect: connection refused" node="localhost" May 13 00:25:13.454206 kubelet[2115]: E0513 00:25:13.454171 2115 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.89:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.89:6443: connect: connection refused" interval="800ms" May 13 00:25:13.571485 kubelet[2115]: I0513 00:25:13.571448 2115 kubelet_node_status.go:76] "Attempting to register node" node="localhost" May 13 00:25:13.571852 kubelet[2115]: E0513 00:25:13.571671 2115 kubelet_node_status.go:108] "Unable to 
register node with API server" err="Post \"https://10.0.0.89:6443/api/v1/nodes\": dial tcp 10.0.0.89:6443: connect: connection refused" node="localhost" May 13 00:25:13.575989 systemd[1]: Created slice kubepods-burstable-pod2980a8ab51edc665be10a02e33130e15.slice - libcontainer container kubepods-burstable-pod2980a8ab51edc665be10a02e33130e15.slice. May 13 00:25:13.591966 kubelet[2115]: E0513 00:25:13.591942 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" May 13 00:25:13.594503 systemd[1]: Created slice kubepods-burstable-pod5386fe11ed933ab82453de11903c7f47.slice - libcontainer container kubepods-burstable-pod5386fe11ed933ab82453de11903c7f47.slice. May 13 00:25:13.604576 kubelet[2115]: E0513 00:25:13.604530 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" May 13 00:25:13.607052 systemd[1]: Created slice kubepods-burstable-pod78458cefcff8c38d49901c8d241f4385.slice - libcontainer container kubepods-burstable-pod78458cefcff8c38d49901c8d241f4385.slice. 
May 13 00:25:13.608504 kubelet[2115]: E0513 00:25:13.608475 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" May 13 00:25:13.655788 kubelet[2115]: I0513 00:25:13.655718 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/78458cefcff8c38d49901c8d241f4385-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"78458cefcff8c38d49901c8d241f4385\") " pod="kube-system/kube-apiserver-localhost" May 13 00:25:13.655788 kubelet[2115]: I0513 00:25:13.655744 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost" May 13 00:25:13.655788 kubelet[2115]: I0513 00:25:13.655761 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost" May 13 00:25:13.655788 kubelet[2115]: I0513 00:25:13.655777 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/78458cefcff8c38d49901c8d241f4385-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"78458cefcff8c38d49901c8d241f4385\") " pod="kube-system/kube-apiserver-localhost" May 13 00:25:13.655905 kubelet[2115]: I0513 00:25:13.655807 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/78458cefcff8c38d49901c8d241f4385-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"78458cefcff8c38d49901c8d241f4385\") " pod="kube-system/kube-apiserver-localhost" May 13 00:25:13.655905 kubelet[2115]: I0513 00:25:13.655823 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost" May 13 00:25:13.655905 kubelet[2115]: I0513 00:25:13.655838 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost" May 13 00:25:13.655905 kubelet[2115]: I0513 00:25:13.655853 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost" May 13 00:25:13.655905 kubelet[2115]: I0513 00:25:13.655868 2115 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2980a8ab51edc665be10a02e33130e15-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"2980a8ab51edc665be10a02e33130e15\") " pod="kube-system/kube-scheduler-localhost" May 13 00:25:13.664151 kubelet[2115]: W0513 00:25:13.664130 2115 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://10.0.0.89:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.89:6443: connect: connection refused May 13 00:25:13.664206 kubelet[2115]: E0513 00:25:13.664159 2115 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.89:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:13.702734 kubelet[2115]: W0513 00:25:13.702689 2115 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.89:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.89:6443: connect: connection refused May 13 00:25:13.702768 kubelet[2115]: E0513 00:25:13.702731 2115 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.89:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:13.846758 kubelet[2115]: W0513 00:25:13.846714 2115 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.89:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.89:6443: connect: connection refused May 13 00:25:13.846899 kubelet[2115]: E0513 00:25:13.846757 2115 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.89:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:13.892617 kubelet[2115]: E0513 
00:25:13.892584 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:13.893227 containerd[1455]: time="2025-05-13T00:25:13.893174089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:2980a8ab51edc665be10a02e33130e15,Namespace:kube-system,Attempt:0,}" May 13 00:25:13.900687 kubelet[2115]: W0513 00:25:13.900658 2115 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.89:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.89:6443: connect: connection refused May 13 00:25:13.900739 kubelet[2115]: E0513 00:25:13.900691 2115 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.89:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.89:6443: connect: connection refused" logger="UnhandledError" May 13 00:25:13.905094 kubelet[2115]: E0513 00:25:13.905063 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:13.905579 containerd[1455]: time="2025-05-13T00:25:13.905521068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:5386fe11ed933ab82453de11903c7f47,Namespace:kube-system,Attempt:0,}" May 13 00:25:13.909762 kubelet[2115]: E0513 00:25:13.909655 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:13.910001 containerd[1455]: time="2025-05-13T00:25:13.909969853Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:78458cefcff8c38d49901c8d241f4385,Namespace:kube-system,Attempt:0,}" May 13 00:25:13.933697 kubelet[2115]: E0513 00:25:13.933608 2115 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.89:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.89:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183eee79e9b23c0c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-13 00:25:12.845556748 +0000 UTC m=+0.402793506,LastTimestamp:2025-05-13 00:25:12.845556748 +0000 UTC m=+0.402793506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 13 00:25:13.972915 kubelet[2115]: I0513 00:25:13.972883 2115 kubelet_node_status.go:76] "Attempting to register node" node="localhost" May 13 00:25:13.973412 kubelet[2115]: E0513 00:25:13.973377 2115 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.89:6443/api/v1/nodes\": dial tcp 10.0.0.89:6443: connect: connection refused" node="localhost" May 13 00:25:14.254908 kubelet[2115]: E0513 00:25:14.254862 2115 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.89:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.89:6443: connect: connection refused" interval="1.6s" May 13 00:25:14.405689 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4044157856.mount: Deactivated successfully. 
May 13 00:25:14.411193 containerd[1455]: time="2025-05-13T00:25:14.411148020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 00:25:14.413181 containerd[1455]: time="2025-05-13T00:25:14.413122824Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
May 13 00:25:14.414719 containerd[1455]: time="2025-05-13T00:25:14.414689433Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 00:25:14.415823 containerd[1455]: time="2025-05-13T00:25:14.415797942Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 00:25:14.417335 containerd[1455]: time="2025-05-13T00:25:14.417300591Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 00:25:14.418674 containerd[1455]: time="2025-05-13T00:25:14.418634763Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
May 13 00:25:14.420106 containerd[1455]: time="2025-05-13T00:25:14.420046301Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056"
May 13 00:25:14.422681 containerd[1455]: time="2025-05-13T00:25:14.422640567Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 00:25:14.424421 containerd[1455]: time="2025-05-13T00:25:14.424377535Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 531.133655ms"
May 13 00:25:14.425066 containerd[1455]: time="2025-05-13T00:25:14.425039787Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 519.406589ms"
May 13 00:25:14.425754 containerd[1455]: time="2025-05-13T00:25:14.425728629Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 515.714503ms"
May 13 00:25:14.581878 containerd[1455]: time="2025-05-13T00:25:14.581673637Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
May 13 00:25:14.581878 containerd[1455]: time="2025-05-13T00:25:14.581730504Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
May 13 00:25:14.581878 containerd[1455]: time="2025-05-13T00:25:14.581745582Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:14.582026 containerd[1455]: time="2025-05-13T00:25:14.581973690Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
May 13 00:25:14.582729 containerd[1455]: time="2025-05-13T00:25:14.582628388Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:14.583670 containerd[1455]: time="2025-05-13T00:25:14.583357125Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
May 13 00:25:14.583670 containerd[1455]: time="2025-05-13T00:25:14.583388224Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:14.583670 containerd[1455]: time="2025-05-13T00:25:14.583498200Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:14.584217 containerd[1455]: time="2025-05-13T00:25:14.583439820Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
May 13 00:25:14.584217 containerd[1455]: time="2025-05-13T00:25:14.583488993Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
May 13 00:25:14.584217 containerd[1455]: time="2025-05-13T00:25:14.583503239Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:14.584596 containerd[1455]: time="2025-05-13T00:25:14.584528162Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:14.602725 systemd[1]: Started cri-containerd-6d5f696cb29600805a12b05b8ad0fcbb64303acf793faf4fc834cdc7391e527e.scope - libcontainer container 6d5f696cb29600805a12b05b8ad0fcbb64303acf793faf4fc834cdc7391e527e.
May 13 00:25:14.606671 systemd[1]: Started cri-containerd-26932c0cdc4ea545f45ca1f5c475ed39e72901354921a27d228bb35d1aa90de8.scope - libcontainer container 26932c0cdc4ea545f45ca1f5c475ed39e72901354921a27d228bb35d1aa90de8.
May 13 00:25:14.608248 systemd[1]: Started cri-containerd-53a6d1a6b53c2692a25aee5db458371b6451d9e2348943899276c91de34a6eed.scope - libcontainer container 53a6d1a6b53c2692a25aee5db458371b6451d9e2348943899276c91de34a6eed.
May 13 00:25:14.643224 containerd[1455]: time="2025-05-13T00:25:14.643092412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:5386fe11ed933ab82453de11903c7f47,Namespace:kube-system,Attempt:0,} returns sandbox id \"6d5f696cb29600805a12b05b8ad0fcbb64303acf793faf4fc834cdc7391e527e\""
May 13 00:25:14.644517 kubelet[2115]: E0513 00:25:14.644468 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:14.647476 containerd[1455]: time="2025-05-13T00:25:14.647315803Z" level=info msg="CreateContainer within sandbox \"6d5f696cb29600805a12b05b8ad0fcbb64303acf793faf4fc834cdc7391e527e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
May 13 00:25:14.650760 containerd[1455]: time="2025-05-13T00:25:14.650631693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:2980a8ab51edc665be10a02e33130e15,Namespace:kube-system,Attempt:0,} returns sandbox id \"53a6d1a6b53c2692a25aee5db458371b6451d9e2348943899276c91de34a6eed\""
May 13 00:25:14.651096 containerd[1455]: time="2025-05-13T00:25:14.650832059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:78458cefcff8c38d49901c8d241f4385,Namespace:kube-system,Attempt:0,} returns sandbox id \"26932c0cdc4ea545f45ca1f5c475ed39e72901354921a27d228bb35d1aa90de8\""
May 13 00:25:14.651424 kubelet[2115]: E0513 00:25:14.651393 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:14.652104 kubelet[2115]: E0513 00:25:14.652067 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:14.653471 containerd[1455]: time="2025-05-13T00:25:14.653450941Z" level=info msg="CreateContainer within sandbox \"53a6d1a6b53c2692a25aee5db458371b6451d9e2348943899276c91de34a6eed\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
May 13 00:25:14.654654 containerd[1455]: time="2025-05-13T00:25:14.654606128Z" level=info msg="CreateContainer within sandbox \"26932c0cdc4ea545f45ca1f5c475ed39e72901354921a27d228bb35d1aa90de8\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
May 13 00:25:14.668686 containerd[1455]: time="2025-05-13T00:25:14.668660620Z" level=info msg="CreateContainer within sandbox \"6d5f696cb29600805a12b05b8ad0fcbb64303acf793faf4fc834cdc7391e527e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"56f2ee85bce65a43b916d04e273088e2b3b6537be362dc7d004ba09d75602b85\""
May 13 00:25:14.669216 containerd[1455]: time="2025-05-13T00:25:14.669197437Z" level=info msg="StartContainer for \"56f2ee85bce65a43b916d04e273088e2b3b6537be362dc7d004ba09d75602b85\""
May 13 00:25:14.682480 containerd[1455]: time="2025-05-13T00:25:14.682378070Z" level=info msg="CreateContainer within sandbox \"53a6d1a6b53c2692a25aee5db458371b6451d9e2348943899276c91de34a6eed\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8e02d2c1ee4b087ef47d1929aa9a6437d6d9b612ac74f3eb7b1a52b6d5c9880a\""
May 13 00:25:14.682989 containerd[1455]: time="2025-05-13T00:25:14.682961955Z" level=info msg="StartContainer for \"8e02d2c1ee4b087ef47d1929aa9a6437d6d9b612ac74f3eb7b1a52b6d5c9880a\""
May 13 00:25:14.687186 containerd[1455]: time="2025-05-13T00:25:14.687133460Z" level=info msg="CreateContainer within sandbox \"26932c0cdc4ea545f45ca1f5c475ed39e72901354921a27d228bb35d1aa90de8\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1e040df616ea73049e7dab09df93f856b1dc7172adf8ad41a3a90d652f1ff63c\""
May 13 00:25:14.688386 containerd[1455]: time="2025-05-13T00:25:14.688139026Z" level=info msg="StartContainer for \"1e040df616ea73049e7dab09df93f856b1dc7172adf8ad41a3a90d652f1ff63c\""
May 13 00:25:14.697878 systemd[1]: Started cri-containerd-56f2ee85bce65a43b916d04e273088e2b3b6537be362dc7d004ba09d75602b85.scope - libcontainer container 56f2ee85bce65a43b916d04e273088e2b3b6537be362dc7d004ba09d75602b85.
May 13 00:25:14.711686 systemd[1]: Started cri-containerd-8e02d2c1ee4b087ef47d1929aa9a6437d6d9b612ac74f3eb7b1a52b6d5c9880a.scope - libcontainer container 8e02d2c1ee4b087ef47d1929aa9a6437d6d9b612ac74f3eb7b1a52b6d5c9880a.
May 13 00:25:14.716679 systemd[1]: Started cri-containerd-1e040df616ea73049e7dab09df93f856b1dc7172adf8ad41a3a90d652f1ff63c.scope - libcontainer container 1e040df616ea73049e7dab09df93f856b1dc7172adf8ad41a3a90d652f1ff63c.
May 13 00:25:14.743520 containerd[1455]: time="2025-05-13T00:25:14.743445906Z" level=info msg="StartContainer for \"56f2ee85bce65a43b916d04e273088e2b3b6537be362dc7d004ba09d75602b85\" returns successfully"
May 13 00:25:14.757306 containerd[1455]: time="2025-05-13T00:25:14.757267141Z" level=info msg="StartContainer for \"8e02d2c1ee4b087ef47d1929aa9a6437d6d9b612ac74f3eb7b1a52b6d5c9880a\" returns successfully"
May 13 00:25:14.757991 containerd[1455]: time="2025-05-13T00:25:14.757457498Z" level=info msg="StartContainer for \"1e040df616ea73049e7dab09df93f856b1dc7172adf8ad41a3a90d652f1ff63c\" returns successfully"
May 13 00:25:14.776896 kubelet[2115]: I0513 00:25:14.776527 2115 kubelet_node_status.go:76] "Attempting to register node" node="localhost"
May 13 00:25:14.777571 kubelet[2115]: E0513 00:25:14.777323 2115 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.89:6443/api/v1/nodes\": dial tcp 10.0.0.89:6443: connect: connection refused" node="localhost"
May 13 00:25:14.876812 kubelet[2115]: E0513 00:25:14.876563 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 13 00:25:14.876812 kubelet[2115]: E0513 00:25:14.876679 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:14.880070 kubelet[2115]: E0513 00:25:14.879909 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 13 00:25:14.880070 kubelet[2115]: E0513 00:25:14.879998 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:14.882246 kubelet[2115]: E0513 00:25:14.882101 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 13 00:25:14.882246 kubelet[2115]: E0513 00:25:14.882174 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:15.881884 kubelet[2115]: E0513 00:25:15.881844 2115 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
May 13 00:25:15.884001 kubelet[2115]: E0513 00:25:15.883980 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 13 00:25:15.884133 kubelet[2115]: E0513 00:25:15.884108 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:15.884185 kubelet[2115]: E0513 00:25:15.884112 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 13 00:25:15.884220 kubelet[2115]: E0513 00:25:15.884188 2115 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 13 00:25:15.884249 kubelet[2115]: E0513 00:25:15.884237 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:15.884331 kubelet[2115]: E0513 00:25:15.884315 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:15.960928 kubelet[2115]: E0513 00:25:15.960888 2115 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
May 13 00:25:16.303256 kubelet[2115]: E0513 00:25:16.303217 2115 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
May 13 00:25:16.379263 kubelet[2115]: I0513 00:25:16.379228 2115 kubelet_node_status.go:76] "Attempting to register node" node="localhost"
May 13 00:25:16.385578 kubelet[2115]: I0513 00:25:16.385527 2115 kubelet_node_status.go:79] "Successfully registered node" node="localhost"
May 13 00:25:16.385578 kubelet[2115]: E0513 00:25:16.385572 2115 kubelet_node_status.go:549] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
May 13 00:25:16.388278 kubelet[2115]: E0513 00:25:16.388260 2115 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 00:25:16.488895 kubelet[2115]: E0513 00:25:16.488841 2115 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 00:25:16.589641 kubelet[2115]: E0513 00:25:16.589511 2115 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 00:25:16.652618 kubelet[2115]: I0513 00:25:16.652568 2115 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 13 00:25:16.658612 kubelet[2115]: I0513 00:25:16.658590 2115 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 13 00:25:16.661975 kubelet[2115]: I0513 00:25:16.661942 2115 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:16.841216 kubelet[2115]: I0513 00:25:16.841084 2115 apiserver.go:52] "Watching apiserver"
May 13 00:25:16.852796 kubelet[2115]: I0513 00:25:16.852768 2115 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
May 13 00:25:16.884286 kubelet[2115]: E0513 00:25:16.884258 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:16.884772 kubelet[2115]: E0513 00:25:16.884341 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:16.884772 kubelet[2115]: E0513 00:25:16.884594 2115 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:17.966176 systemd[1]: Reloading requested from client PID 2392 ('systemctl') (unit session-7.scope)...
May 13 00:25:17.966202 systemd[1]: Reloading...
May 13 00:25:18.047576 zram_generator::config[2433]: No configuration found.
May 13 00:25:18.155942 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 00:25:18.247652 systemd[1]: Reloading finished in 280 ms.
May 13 00:25:18.289607 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 00:25:18.305902 systemd[1]: kubelet.service: Deactivated successfully.
May 13 00:25:18.306126 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 00:25:18.317744 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 00:25:18.474559 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 00:25:18.479948 (kubelet)[2476]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 13 00:25:18.517448 kubelet[2476]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 00:25:18.518286 kubelet[2476]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 13 00:25:18.518286 kubelet[2476]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 00:25:18.518286 kubelet[2476]: I0513 00:25:18.517868 2476 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 13 00:25:18.525172 kubelet[2476]: I0513 00:25:18.525142 2476 server.go:520] "Kubelet version" kubeletVersion="v1.32.0"
May 13 00:25:18.525172 kubelet[2476]: I0513 00:25:18.525164 2476 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 13 00:25:18.525404 kubelet[2476]: I0513 00:25:18.525387 2476 server.go:954] "Client rotation is on, will bootstrap in background"
May 13 00:25:18.526435 kubelet[2476]: I0513 00:25:18.526418 2476 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
May 13 00:25:18.529123 kubelet[2476]: I0513 00:25:18.529080 2476 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 13 00:25:18.531649 kubelet[2476]: E0513 00:25:18.531592 2476 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
May 13 00:25:18.531649 kubelet[2476]: I0513 00:25:18.531629 2476 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
May 13 00:25:18.536854 kubelet[2476]: I0513 00:25:18.536822 2476 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 13 00:25:18.537090 kubelet[2476]: I0513 00:25:18.537066 2476 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 13 00:25:18.537260 kubelet[2476]: I0513 00:25:18.537092 2476 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 13 00:25:18.537343 kubelet[2476]: I0513 00:25:18.537264 2476 topology_manager.go:138] "Creating topology manager with none policy"
May 13 00:25:18.537343 kubelet[2476]: I0513 00:25:18.537272 2476 container_manager_linux.go:304] "Creating device plugin manager"
May 13 00:25:18.537343 kubelet[2476]: I0513 00:25:18.537320 2476 state_mem.go:36] "Initialized new in-memory state store"
May 13 00:25:18.537486 kubelet[2476]: I0513 00:25:18.537470 2476 kubelet.go:446] "Attempting to sync node with API server"
May 13 00:25:18.537511 kubelet[2476]: I0513 00:25:18.537487 2476 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
May 13 00:25:18.537531 kubelet[2476]: I0513 00:25:18.537514 2476 kubelet.go:352] "Adding apiserver pod source"
May 13 00:25:18.537531 kubelet[2476]: I0513 00:25:18.537530 2476 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 13 00:25:18.538616 kubelet[2476]: I0513 00:25:18.538506 2476 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
May 13 00:25:18.539254 kubelet[2476]: I0513 00:25:18.539003 2476 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 13 00:25:18.539440 kubelet[2476]: I0513 00:25:18.539429 2476 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 13 00:25:18.539505 kubelet[2476]: I0513 00:25:18.539497 2476 server.go:1287] "Started kubelet"
May 13 00:25:18.539768 kubelet[2476]: I0513 00:25:18.539714 2476 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
May 13 00:25:18.539882 kubelet[2476]: I0513 00:25:18.539849 2476 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 13 00:25:18.540145 kubelet[2476]: I0513 00:25:18.540132 2476 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 13 00:25:18.540810 kubelet[2476]: I0513 00:25:18.540789 2476 server.go:490] "Adding debug handlers to kubelet server"
May 13 00:25:18.544592 kubelet[2476]: I0513 00:25:18.544560 2476 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 13 00:25:18.547826 kubelet[2476]: I0513 00:25:18.546705 2476 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 13 00:25:18.549259 kubelet[2476]: I0513 00:25:18.549239 2476 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 13 00:25:18.549578 kubelet[2476]: I0513 00:25:18.549567 2476 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
May 13 00:25:18.549754 kubelet[2476]: I0513 00:25:18.549744 2476 reconciler.go:26] "Reconciler: start to sync state"
May 13 00:25:18.549885 kubelet[2476]: E0513 00:25:18.549862 2476 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 13 00:25:18.550805 kubelet[2476]: I0513 00:25:18.550055 2476 factory.go:221] Registration of the systemd container factory successfully
May 13 00:25:18.550805 kubelet[2476]: I0513 00:25:18.550362 2476 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 13 00:25:18.551908 kubelet[2476]: I0513 00:25:18.551856 2476 factory.go:221] Registration of the containerd container factory successfully
May 13 00:25:18.558400 kubelet[2476]: I0513 00:25:18.558364 2476 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 13 00:25:18.559977 kubelet[2476]: I0513 00:25:18.559959 2476 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 13 00:25:18.560028 kubelet[2476]: I0513 00:25:18.559980 2476 status_manager.go:227] "Starting to sync pod status with apiserver"
May 13 00:25:18.560028 kubelet[2476]: I0513 00:25:18.560002 2476 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 13 00:25:18.560028 kubelet[2476]: I0513 00:25:18.560009 2476 kubelet.go:2388] "Starting kubelet main sync loop"
May 13 00:25:18.560093 kubelet[2476]: E0513 00:25:18.560059 2476 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 13 00:25:18.581822 kubelet[2476]: I0513 00:25:18.581787 2476 cpu_manager.go:221] "Starting CPU manager" policy="none"
May 13 00:25:18.581822 kubelet[2476]: I0513 00:25:18.581803 2476 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
May 13 00:25:18.581822 kubelet[2476]: I0513 00:25:18.581821 2476 state_mem.go:36] "Initialized new in-memory state store"
May 13 00:25:18.581978 kubelet[2476]: I0513 00:25:18.581943 2476 state_mem.go:88] "Updated default CPUSet" cpuSet=""
May 13 00:25:18.581978 kubelet[2476]: I0513 00:25:18.581953 2476 state_mem.go:96] "Updated CPUSet assignments" assignments={}
May 13 00:25:18.581978 kubelet[2476]: I0513 00:25:18.581971 2476 policy_none.go:49] "None policy: Start"
May 13 00:25:18.582035 kubelet[2476]: I0513 00:25:18.581985 2476 memory_manager.go:186] "Starting memorymanager" policy="None"
May 13 00:25:18.582035 kubelet[2476]: I0513 00:25:18.581994 2476 state_mem.go:35] "Initializing new in-memory state store"
May 13 00:25:18.582095 kubelet[2476]: I0513 00:25:18.582080 2476 state_mem.go:75] "Updated machine memory state"
May 13 00:25:18.585659 kubelet[2476]: I0513 00:25:18.585641 2476 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 13 00:25:18.586184 kubelet[2476]: I0513 00:25:18.585795 2476 eviction_manager.go:189] "Eviction manager: starting control loop"
May 13 00:25:18.586184 kubelet[2476]: I0513 00:25:18.585808 2476 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 13 00:25:18.586184 kubelet[2476]: I0513 00:25:18.586044 2476 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 13 00:25:18.587053 kubelet[2476]: E0513 00:25:18.587030 2476 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
May 13 00:25:18.661348 kubelet[2476]: I0513 00:25:18.661317 2476 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:18.661481 kubelet[2476]: I0513 00:25:18.661410 2476 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 13 00:25:18.661481 kubelet[2476]: I0513 00:25:18.661433 2476 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 13 00:25:18.667059 kubelet[2476]: E0513 00:25:18.667004 2476 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:18.667362 kubelet[2476]: E0513 00:25:18.667313 2476 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
May 13 00:25:18.667402 kubelet[2476]: E0513 00:25:18.667365 2476 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
May 13 00:25:18.692586 kubelet[2476]: I0513 00:25:18.692553 2476 kubelet_node_status.go:76] "Attempting to register node" node="localhost"
May 13 00:25:18.714173 kubelet[2476]: I0513 00:25:18.714137 2476 kubelet_node_status.go:125] "Node was previously registered" node="localhost"
May 13 00:25:18.714315 kubelet[2476]: I0513 00:25:18.714222 2476 kubelet_node_status.go:79] "Successfully registered node" node="localhost"
May 13 00:25:18.750974 kubelet[2476]: I0513 00:25:18.750944 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:18.750974 kubelet[2476]: I0513 00:25:18.750970 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:18.751065 kubelet[2476]: I0513 00:25:18.751036 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:18.751186 kubelet[2476]: I0513 00:25:18.751133 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:18.751186 kubelet[2476]: I0513 00:25:18.751177 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2980a8ab51edc665be10a02e33130e15-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"2980a8ab51edc665be10a02e33130e15\") " pod="kube-system/kube-scheduler-localhost"
May 13 00:25:18.751186 kubelet[2476]: I0513 00:25:18.751198 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5386fe11ed933ab82453de11903c7f47-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"5386fe11ed933ab82453de11903c7f47\") " pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:18.751382 kubelet[2476]: I0513 00:25:18.751214 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/78458cefcff8c38d49901c8d241f4385-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"78458cefcff8c38d49901c8d241f4385\") " pod="kube-system/kube-apiserver-localhost"
May 13 00:25:18.751382 kubelet[2476]: I0513 00:25:18.751245 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/78458cefcff8c38d49901c8d241f4385-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"78458cefcff8c38d49901c8d241f4385\") " pod="kube-system/kube-apiserver-localhost"
May 13 00:25:18.751382 kubelet[2476]: I0513 00:25:18.751261 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/78458cefcff8c38d49901c8d241f4385-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"78458cefcff8c38d49901c8d241f4385\") " pod="kube-system/kube-apiserver-localhost"
May 13 00:25:18.967782 kubelet[2476]: E0513 00:25:18.967562 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:18.967782 kubelet[2476]: E0513 00:25:18.967705 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:18.968061 kubelet[2476]: E0513 00:25:18.967704 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:19.538482 kubelet[2476]: I0513 00:25:19.538179 2476 apiserver.go:52] "Watching apiserver"
May 13 00:25:19.550844 kubelet[2476]: I0513 00:25:19.550092 2476 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
May 13 00:25:19.576586 kubelet[2476]: I0513 00:25:19.573373 2476 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 13 00:25:19.576586 kubelet[2476]: E0513 00:25:19.573760 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:19.576901 kubelet[2476]: I0513 00:25:19.576886 2476 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:19.586293 kubelet[2476]: E0513 00:25:19.586255 2476 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
May 13 00:25:19.586578 kubelet[2476]: E0513 00:25:19.586564 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:19.599993 kubelet[2476]: E0513 00:25:19.599948 2476 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost"
May 13 00:25:19.600160 kubelet[2476]: E0513 00:25:19.600138 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:19.620945 kubelet[2476]: I0513 00:25:19.620845 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=3.620820947 podStartE2EDuration="3.620820947s" podCreationTimestamp="2025-05-13 00:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 00:25:19.610805281 +0000 UTC m=+1.126385128" watchObservedRunningTime="2025-05-13 00:25:19.620820947 +0000 UTC m=+1.136400795"
May 13 00:25:19.632809 kubelet[2476]: I0513 00:25:19.632743 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.632723012 podStartE2EDuration="3.632723012s" podCreationTimestamp="2025-05-13 00:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 00:25:19.621075675 +0000 UTC m=+1.136655522" watchObservedRunningTime="2025-05-13 00:25:19.632723012 +0000 UTC m=+1.148302849"
May 13 00:25:19.642723 kubelet[2476]: I0513 00:25:19.642623 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.642606641 podStartE2EDuration="3.642606641s" podCreationTimestamp="2025-05-13 00:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 00:25:19.63358035 +0000 UTC m=+1.149160197" watchObservedRunningTime="2025-05-13 00:25:19.642606641 +0000 UTC m=+1.158186488"
May 13 00:25:20.574378 kubelet[2476]: E0513 00:25:20.574334 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:20.574834 kubelet[2476]: E0513 00:25:20.574779 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver
limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:20.574998 kubelet[2476]: E0513 00:25:20.574973 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:21.575357 kubelet[2476]: E0513 00:25:21.575330 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:21.575357 kubelet[2476]: E0513 00:25:21.575331 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:21.850387 kubelet[2476]: E0513 00:25:21.850265 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:23.156549 sudo[1633]: pam_unix(sudo:session): session closed for user root May 13 00:25:23.158651 sshd[1629]: pam_unix(sshd:session): session closed for user core May 13 00:25:23.161919 systemd[1]: sshd@6-10.0.0.89:22-10.0.0.1:53830.service: Deactivated successfully. May 13 00:25:23.165853 systemd[1]: session-7.scope: Deactivated successfully. May 13 00:25:23.166243 systemd[1]: session-7.scope: Consumed 3.828s CPU time, 158.8M memory peak, 0B memory swap peak. May 13 00:25:23.167364 systemd-logind[1436]: Session 7 logged out. Waiting for processes to exit. May 13 00:25:23.169596 systemd-logind[1436]: Removed session 7. 
May 13 00:25:23.226896 kubelet[2476]: I0513 00:25:23.226501 2476 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 13 00:25:23.227320 containerd[1455]: time="2025-05-13T00:25:23.226823565Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 13 00:25:23.228520 kubelet[2476]: I0513 00:25:23.227854 2476 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 13 00:25:23.487757 kubelet[2476]: E0513 00:25:23.487631 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:24.106710 systemd[1]: Created slice kubepods-besteffort-pod860c5d0d_1a57_4e6a_82ff_8913f873596a.slice - libcontainer container kubepods-besteffort-pod860c5d0d_1a57_4e6a_82ff_8913f873596a.slice. May 13 00:25:24.184561 kubelet[2476]: I0513 00:25:24.184463 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/860c5d0d-1a57-4e6a-82ff-8913f873596a-kube-proxy\") pod \"kube-proxy-k5qsr\" (UID: \"860c5d0d-1a57-4e6a-82ff-8913f873596a\") " pod="kube-system/kube-proxy-k5qsr" May 13 00:25:24.184561 kubelet[2476]: I0513 00:25:24.184503 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/860c5d0d-1a57-4e6a-82ff-8913f873596a-xtables-lock\") pod \"kube-proxy-k5qsr\" (UID: \"860c5d0d-1a57-4e6a-82ff-8913f873596a\") " pod="kube-system/kube-proxy-k5qsr" May 13 00:25:24.184561 kubelet[2476]: I0513 00:25:24.184518 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/860c5d0d-1a57-4e6a-82ff-8913f873596a-lib-modules\") pod \"kube-proxy-k5qsr\" 
(UID: \"860c5d0d-1a57-4e6a-82ff-8913f873596a\") " pod="kube-system/kube-proxy-k5qsr" May 13 00:25:24.184561 kubelet[2476]: I0513 00:25:24.184546 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v992f\" (UniqueName: \"kubernetes.io/projected/860c5d0d-1a57-4e6a-82ff-8913f873596a-kube-api-access-v992f\") pod \"kube-proxy-k5qsr\" (UID: \"860c5d0d-1a57-4e6a-82ff-8913f873596a\") " pod="kube-system/kube-proxy-k5qsr" May 13 00:25:24.418136 kubelet[2476]: E0513 00:25:24.418009 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:24.418573 containerd[1455]: time="2025-05-13T00:25:24.418510455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-k5qsr,Uid:860c5d0d-1a57-4e6a-82ff-8913f873596a,Namespace:kube-system,Attempt:0,}" May 13 00:25:24.606819 systemd[1]: Created slice kubepods-besteffort-podd0f9fd61_2ed3_4fac_8c18_a0aa1f41cb72.slice - libcontainer container kubepods-besteffort-podd0f9fd61_2ed3_4fac_8c18_a0aa1f41cb72.slice. May 13 00:25:24.610672 containerd[1455]: time="2025-05-13T00:25:24.610599965Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 May 13 00:25:24.610960 containerd[1455]: time="2025-05-13T00:25:24.610655943Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 May 13 00:25:24.611444 containerd[1455]: time="2025-05-13T00:25:24.611406706Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:25:24.611655 containerd[1455]: time="2025-05-13T00:25:24.611598185Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:25:24.632743 systemd[1]: Started cri-containerd-e2bf07e1212ff46e0894db93d6de523bc3574b99ecff16d344542009fcd60652.scope - libcontainer container e2bf07e1212ff46e0894db93d6de523bc3574b99ecff16d344542009fcd60652. May 13 00:25:24.652183 containerd[1455]: time="2025-05-13T00:25:24.652133682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-k5qsr,Uid:860c5d0d-1a57-4e6a-82ff-8913f873596a,Namespace:kube-system,Attempt:0,} returns sandbox id \"e2bf07e1212ff46e0894db93d6de523bc3574b99ecff16d344542009fcd60652\"" May 13 00:25:24.652700 kubelet[2476]: E0513 00:25:24.652680 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:24.654653 containerd[1455]: time="2025-05-13T00:25:24.654618640Z" level=info msg="CreateContainer within sandbox \"e2bf07e1212ff46e0894db93d6de523bc3574b99ecff16d344542009fcd60652\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 13 00:25:24.672431 containerd[1455]: time="2025-05-13T00:25:24.672338189Z" level=info msg="CreateContainer within sandbox \"e2bf07e1212ff46e0894db93d6de523bc3574b99ecff16d344542009fcd60652\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"46f0bcd14c9a040c55d47ab49ee16fb1f9d6949569e8a493f87fddf7e4f53339\"" May 13 00:25:24.672851 containerd[1455]: time="2025-05-13T00:25:24.672832520Z" level=info msg="StartContainer for \"46f0bcd14c9a040c55d47ab49ee16fb1f9d6949569e8a493f87fddf7e4f53339\"" May 13 00:25:24.687788 kubelet[2476]: I0513 00:25:24.687741 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d0f9fd61-2ed3-4fac-8c18-a0aa1f41cb72-var-lib-calico\") pod \"tigera-operator-789496d6f5-jw72c\" (UID: \"d0f9fd61-2ed3-4fac-8c18-a0aa1f41cb72\") " 
pod="tigera-operator/tigera-operator-789496d6f5-jw72c" May 13 00:25:24.687788 kubelet[2476]: I0513 00:25:24.687782 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznvg\" (UniqueName: \"kubernetes.io/projected/d0f9fd61-2ed3-4fac-8c18-a0aa1f41cb72-kube-api-access-sznvg\") pod \"tigera-operator-789496d6f5-jw72c\" (UID: \"d0f9fd61-2ed3-4fac-8c18-a0aa1f41cb72\") " pod="tigera-operator/tigera-operator-789496d6f5-jw72c" May 13 00:25:24.707691 systemd[1]: Started cri-containerd-46f0bcd14c9a040c55d47ab49ee16fb1f9d6949569e8a493f87fddf7e4f53339.scope - libcontainer container 46f0bcd14c9a040c55d47ab49ee16fb1f9d6949569e8a493f87fddf7e4f53339. May 13 00:25:24.747507 containerd[1455]: time="2025-05-13T00:25:24.747458743Z" level=info msg="StartContainer for \"46f0bcd14c9a040c55d47ab49ee16fb1f9d6949569e8a493f87fddf7e4f53339\" returns successfully" May 13 00:25:24.910461 containerd[1455]: time="2025-05-13T00:25:24.910424436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-jw72c,Uid:d0f9fd61-2ed3-4fac-8c18-a0aa1f41cb72,Namespace:tigera-operator,Attempt:0,}" May 13 00:25:24.933288 containerd[1455]: time="2025-05-13T00:25:24.933117277Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 May 13 00:25:24.933288 containerd[1455]: time="2025-05-13T00:25:24.933173696Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 May 13 00:25:24.933288 containerd[1455]: time="2025-05-13T00:25:24.933184908Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:25:24.933590 containerd[1455]: time="2025-05-13T00:25:24.933278568Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:25:24.951670 systemd[1]: Started cri-containerd-c50f1efa90641fbaac582596c9b27475535fa7404c5a8bf85b00990c4e280563.scope - libcontainer container c50f1efa90641fbaac582596c9b27475535fa7404c5a8bf85b00990c4e280563. May 13 00:25:24.986314 containerd[1455]: time="2025-05-13T00:25:24.986279042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-jw72c,Uid:d0f9fd61-2ed3-4fac-8c18-a0aa1f41cb72,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c50f1efa90641fbaac582596c9b27475535fa7404c5a8bf85b00990c4e280563\"" May 13 00:25:24.988019 containerd[1455]: time="2025-05-13T00:25:24.987985092Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 13 00:25:25.582620 kubelet[2476]: E0513 00:25:25.582396 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:25.589880 kubelet[2476]: I0513 00:25:25.589643 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-k5qsr" podStartSLOduration=1.5896329489999999 podStartE2EDuration="1.589632949s" podCreationTimestamp="2025-05-13 00:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 00:25:25.589257358 +0000 UTC m=+7.104837205" watchObservedRunningTime="2025-05-13 00:25:25.589632949 +0000 UTC m=+7.105212796" May 13 00:25:26.627579 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount223939650.mount: Deactivated successfully. 
May 13 00:25:28.186995 containerd[1455]: time="2025-05-13T00:25:28.186938275Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:28.187757 containerd[1455]: time="2025-05-13T00:25:28.187699361Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 13 00:25:28.190552 containerd[1455]: time="2025-05-13T00:25:28.189611126Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:28.192559 containerd[1455]: time="2025-05-13T00:25:28.192498627Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:28.193489 containerd[1455]: time="2025-05-13T00:25:28.193454725Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 3.205432973s" May 13 00:25:28.193559 containerd[1455]: time="2025-05-13T00:25:28.193491836Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 13 00:25:28.195612 containerd[1455]: time="2025-05-13T00:25:28.195577594Z" level=info msg="CreateContainer within sandbox \"c50f1efa90641fbaac582596c9b27475535fa7404c5a8bf85b00990c4e280563\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 13 00:25:28.209130 containerd[1455]: time="2025-05-13T00:25:28.209091062Z" level=info msg="CreateContainer within sandbox 
\"c50f1efa90641fbaac582596c9b27475535fa7404c5a8bf85b00990c4e280563\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8364a65a7cc2d9be8be9cadde64f8d4952d98bc451938928b17a5fcd011067f3\"" May 13 00:25:28.209629 containerd[1455]: time="2025-05-13T00:25:28.209599624Z" level=info msg="StartContainer for \"8364a65a7cc2d9be8be9cadde64f8d4952d98bc451938928b17a5fcd011067f3\"" May 13 00:25:28.240655 systemd[1]: Started cri-containerd-8364a65a7cc2d9be8be9cadde64f8d4952d98bc451938928b17a5fcd011067f3.scope - libcontainer container 8364a65a7cc2d9be8be9cadde64f8d4952d98bc451938928b17a5fcd011067f3. May 13 00:25:28.264913 containerd[1455]: time="2025-05-13T00:25:28.264872396Z" level=info msg="StartContainer for \"8364a65a7cc2d9be8be9cadde64f8d4952d98bc451938928b17a5fcd011067f3\" returns successfully" May 13 00:25:31.040234 kubelet[2476]: E0513 00:25:31.040158 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:31.082185 kubelet[2476]: I0513 00:25:31.082111 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-789496d6f5-jw72c" podStartSLOduration=3.875184549 podStartE2EDuration="7.082092449s" podCreationTimestamp="2025-05-13 00:25:24 +0000 UTC" firstStartedPulling="2025-05-13 00:25:24.987350531 +0000 UTC m=+6.502930378" lastFinishedPulling="2025-05-13 00:25:28.194258431 +0000 UTC m=+9.709838278" observedRunningTime="2025-05-13 00:25:28.595982831 +0000 UTC m=+10.111562678" watchObservedRunningTime="2025-05-13 00:25:31.082092449 +0000 UTC m=+12.597672296" May 13 00:25:31.282645 systemd[1]: Created slice kubepods-besteffort-pod17d1d7ef_68e6_4526_90e9_cb7aee244b81.slice - libcontainer container kubepods-besteffort-pod17d1d7ef_68e6_4526_90e9_cb7aee244b81.slice. 
May 13 00:25:31.335957 kubelet[2476]: I0513 00:25:31.335831 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6da78090-2c51-4b10-b9d4-8db249fc4886-tigera-ca-bundle\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.335957 kubelet[2476]: I0513 00:25:31.335875 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/17d1d7ef-68e6-4526-90e9-cb7aee244b81-typha-certs\") pod \"calico-typha-787554bc4-8k88b\" (UID: \"17d1d7ef-68e6-4526-90e9-cb7aee244b81\") " pod="calico-system/calico-typha-787554bc4-8k88b" May 13 00:25:31.335957 kubelet[2476]: I0513 00:25:31.335908 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-cni-net-dir\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.335957 kubelet[2476]: I0513 00:25:31.335924 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17d1d7ef-68e6-4526-90e9-cb7aee244b81-tigera-ca-bundle\") pod \"calico-typha-787554bc4-8k88b\" (UID: \"17d1d7ef-68e6-4526-90e9-cb7aee244b81\") " pod="calico-system/calico-typha-787554bc4-8k88b" May 13 00:25:31.335957 kubelet[2476]: I0513 00:25:31.335942 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-flexvol-driver-host\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 
00:25:31.336166 kubelet[2476]: I0513 00:25:31.335956 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64m4c\" (UniqueName: \"kubernetes.io/projected/17d1d7ef-68e6-4526-90e9-cb7aee244b81-kube-api-access-64m4c\") pod \"calico-typha-787554bc4-8k88b\" (UID: \"17d1d7ef-68e6-4526-90e9-cb7aee244b81\") " pod="calico-system/calico-typha-787554bc4-8k88b" May 13 00:25:31.336166 kubelet[2476]: I0513 00:25:31.335972 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6da78090-2c51-4b10-b9d4-8db249fc4886-node-certs\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.336166 kubelet[2476]: I0513 00:25:31.335987 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-var-run-calico\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.336166 kubelet[2476]: I0513 00:25:31.336005 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-policysync\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.336166 kubelet[2476]: I0513 00:25:31.336018 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-lib-modules\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.336163 systemd[1]: Created slice 
kubepods-besteffort-pod6da78090_2c51_4b10_b9d4_8db249fc4886.slice - libcontainer container kubepods-besteffort-pod6da78090_2c51_4b10_b9d4_8db249fc4886.slice. May 13 00:25:31.336391 kubelet[2476]: I0513 00:25:31.336031 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-xtables-lock\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.336391 kubelet[2476]: I0513 00:25:31.336047 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-cni-bin-dir\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.336391 kubelet[2476]: I0513 00:25:31.336061 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9sfd\" (UniqueName: \"kubernetes.io/projected/6da78090-2c51-4b10-b9d4-8db249fc4886-kube-api-access-k9sfd\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.336391 kubelet[2476]: I0513 00:25:31.336077 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-var-lib-calico\") pod \"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.336391 kubelet[2476]: I0513 00:25:31.336091 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6da78090-2c51-4b10-b9d4-8db249fc4886-cni-log-dir\") pod 
\"calico-node-lj45c\" (UID: \"6da78090-2c51-4b10-b9d4-8db249fc4886\") " pod="calico-system/calico-node-lj45c" May 13 00:25:31.439631 kubelet[2476]: E0513 00:25:31.438277 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.439631 kubelet[2476]: W0513 00:25:31.438308 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.439631 kubelet[2476]: E0513 00:25:31.438348 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.439631 kubelet[2476]: E0513 00:25:31.438591 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.439631 kubelet[2476]: W0513 00:25:31.438600 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.439631 kubelet[2476]: E0513 00:25:31.438622 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.439631 kubelet[2476]: E0513 00:25:31.438924 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.439631 kubelet[2476]: W0513 00:25:31.438933 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.439631 kubelet[2476]: E0513 00:25:31.438956 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.439631 kubelet[2476]: E0513 00:25:31.439221 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.439954 kubelet[2476]: W0513 00:25:31.439228 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.439954 kubelet[2476]: E0513 00:25:31.439307 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.439954 kubelet[2476]: E0513 00:25:31.439652 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.439954 kubelet[2476]: W0513 00:25:31.439662 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.439954 kubelet[2476]: E0513 00:25:31.439760 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.440380 kubelet[2476]: E0513 00:25:31.440352 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.440380 kubelet[2476]: W0513 00:25:31.440369 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.440531 kubelet[2476]: E0513 00:25:31.440449 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.440658 kubelet[2476]: E0513 00:25:31.440632 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.440658 kubelet[2476]: W0513 00:25:31.440647 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.440773 kubelet[2476]: E0513 00:25:31.440752 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.440953 kubelet[2476]: E0513 00:25:31.440909 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb" May 13 00:25:31.441627 kubelet[2476]: E0513 00:25:31.441600 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.441627 kubelet[2476]: W0513 00:25:31.441619 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.441700 kubelet[2476]: E0513 00:25:31.441673 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.441884 kubelet[2476]: E0513 00:25:31.441848 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.441884 kubelet[2476]: W0513 00:25:31.441865 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.441942 kubelet[2476]: E0513 00:25:31.441913 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.442581 kubelet[2476]: E0513 00:25:31.442505 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.442581 kubelet[2476]: W0513 00:25:31.442518 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.442834 kubelet[2476]: E0513 00:25:31.442766 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.442834 kubelet[2476]: W0513 00:25:31.442784 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.442834 kubelet[2476]: E0513 00:25:31.442829 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
[May 13 00:25:31.443 through 00:25:31.532: the same three kubelet messages (driver-call.go:262 "Failed to unmarshal output for command: init", driver-call.go:149 "FlexVolume: driver call failed: executable file not found in $PATH", plugins.go:695 "Error dynamically probing plugins ... nodeagent~uds, skipping") repeat verbatim with only the timestamps changing]
May 13 00:25:31.532683 kubelet[2476]: E0513 00:25:31.532620 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.532846 kubelet[2476]: E0513 00:25:31.532832 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.532846 kubelet[2476]: W0513 00:25:31.532842 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.532924 kubelet[2476]: E0513 00:25:31.532851 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.533322 kubelet[2476]: E0513 00:25:31.533297 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.533322 kubelet[2476]: W0513 00:25:31.533310 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.533322 kubelet[2476]: E0513 00:25:31.533320 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.533575 kubelet[2476]: E0513 00:25:31.533559 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.533575 kubelet[2476]: W0513 00:25:31.533572 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.533624 kubelet[2476]: E0513 00:25:31.533582 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.533849 kubelet[2476]: E0513 00:25:31.533789 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.533849 kubelet[2476]: W0513 00:25:31.533802 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.533849 kubelet[2476]: E0513 00:25:31.533810 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.534174 kubelet[2476]: E0513 00:25:31.534061 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.534174 kubelet[2476]: W0513 00:25:31.534076 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.534174 kubelet[2476]: E0513 00:25:31.534087 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.534302 kubelet[2476]: E0513 00:25:31.534286 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.534302 kubelet[2476]: W0513 00:25:31.534299 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.534355 kubelet[2476]: E0513 00:25:31.534310 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.534551 kubelet[2476]: E0513 00:25:31.534527 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.534580 kubelet[2476]: W0513 00:25:31.534552 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.534580 kubelet[2476]: E0513 00:25:31.534561 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.534829 kubelet[2476]: E0513 00:25:31.534816 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.534979 kubelet[2476]: W0513 00:25:31.534872 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.534979 kubelet[2476]: E0513 00:25:31.534886 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.538207 kubelet[2476]: E0513 00:25:31.538154 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.538207 kubelet[2476]: W0513 00:25:31.538203 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.538283 kubelet[2476]: E0513 00:25:31.538215 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.538283 kubelet[2476]: I0513 00:25:31.538246 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtsd\" (UniqueName: \"kubernetes.io/projected/93974201-6dc9-4f10-8077-3a1e2417dccb-kube-api-access-qhtsd\") pod \"csi-node-driver-xgblg\" (UID: \"93974201-6dc9-4f10-8077-3a1e2417dccb\") " pod="calico-system/csi-node-driver-xgblg" May 13 00:25:31.538503 kubelet[2476]: E0513 00:25:31.538487 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.538503 kubelet[2476]: W0513 00:25:31.538499 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.538503 kubelet[2476]: E0513 00:25:31.538514 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.538612 kubelet[2476]: I0513 00:25:31.538528 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93974201-6dc9-4f10-8077-3a1e2417dccb-registration-dir\") pod \"csi-node-driver-xgblg\" (UID: \"93974201-6dc9-4f10-8077-3a1e2417dccb\") " pod="calico-system/csi-node-driver-xgblg" May 13 00:25:31.538846 kubelet[2476]: E0513 00:25:31.538802 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.538846 kubelet[2476]: W0513 00:25:31.538816 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.538846 kubelet[2476]: E0513 00:25:31.538832 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.539057 kubelet[2476]: E0513 00:25:31.539039 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.539057 kubelet[2476]: W0513 00:25:31.539053 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.539129 kubelet[2476]: E0513 00:25:31.539067 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.539353 kubelet[2476]: E0513 00:25:31.539324 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.539353 kubelet[2476]: W0513 00:25:31.539345 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.539454 kubelet[2476]: E0513 00:25:31.539359 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.539454 kubelet[2476]: I0513 00:25:31.539383 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93974201-6dc9-4f10-8077-3a1e2417dccb-socket-dir\") pod \"csi-node-driver-xgblg\" (UID: \"93974201-6dc9-4f10-8077-3a1e2417dccb\") " pod="calico-system/csi-node-driver-xgblg" May 13 00:25:31.539630 kubelet[2476]: E0513 00:25:31.539611 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.539630 kubelet[2476]: W0513 00:25:31.539625 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.539733 kubelet[2476]: E0513 00:25:31.539640 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.539733 kubelet[2476]: I0513 00:25:31.539653 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/93974201-6dc9-4f10-8077-3a1e2417dccb-varrun\") pod \"csi-node-driver-xgblg\" (UID: \"93974201-6dc9-4f10-8077-3a1e2417dccb\") " pod="calico-system/csi-node-driver-xgblg" May 13 00:25:31.539974 kubelet[2476]: E0513 00:25:31.539920 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.539974 kubelet[2476]: W0513 00:25:31.539935 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.539974 kubelet[2476]: E0513 00:25:31.539951 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.540182 kubelet[2476]: E0513 00:25:31.540169 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.540222 kubelet[2476]: W0513 00:25:31.540181 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.540222 kubelet[2476]: E0513 00:25:31.540198 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.540436 kubelet[2476]: E0513 00:25:31.540418 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.540436 kubelet[2476]: W0513 00:25:31.540433 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.540489 kubelet[2476]: E0513 00:25:31.540451 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.540489 kubelet[2476]: I0513 00:25:31.540470 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93974201-6dc9-4f10-8077-3a1e2417dccb-kubelet-dir\") pod \"csi-node-driver-xgblg\" (UID: \"93974201-6dc9-4f10-8077-3a1e2417dccb\") " pod="calico-system/csi-node-driver-xgblg" May 13 00:25:31.540741 kubelet[2476]: E0513 00:25:31.540719 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.540741 kubelet[2476]: W0513 00:25:31.540732 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.540814 kubelet[2476]: E0513 00:25:31.540746 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.541018 kubelet[2476]: E0513 00:25:31.541001 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.541018 kubelet[2476]: W0513 00:25:31.541014 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.541074 kubelet[2476]: E0513 00:25:31.541025 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.541278 kubelet[2476]: E0513 00:25:31.541261 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.541371 kubelet[2476]: W0513 00:25:31.541353 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.541371 kubelet[2476]: E0513 00:25:31.541380 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.541646 kubelet[2476]: E0513 00:25:31.541632 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.541646 kubelet[2476]: W0513 00:25:31.541643 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.541706 kubelet[2476]: E0513 00:25:31.541653 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.541937 kubelet[2476]: E0513 00:25:31.541886 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.541937 kubelet[2476]: W0513 00:25:31.541898 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.541937 kubelet[2476]: E0513 00:25:31.541906 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.542152 kubelet[2476]: E0513 00:25:31.542129 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.542152 kubelet[2476]: W0513 00:25:31.542142 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.542152 kubelet[2476]: E0513 00:25:31.542150 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.589553 kubelet[2476]: E0513 00:25:31.588505 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:31.589905 containerd[1455]: time="2025-05-13T00:25:31.589698773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-787554bc4-8k88b,Uid:17d1d7ef-68e6-4526-90e9-cb7aee244b81,Namespace:calico-system,Attempt:0,}" May 13 00:25:31.618769 containerd[1455]: time="2025-05-13T00:25:31.618647147Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 May 13 00:25:31.618769 containerd[1455]: time="2025-05-13T00:25:31.618710037Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 May 13 00:25:31.618769 containerd[1455]: time="2025-05-13T00:25:31.618722380Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:25:31.619031 containerd[1455]: time="2025-05-13T00:25:31.618824795Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:25:31.640515 kubelet[2476]: E0513 00:25:31.640472 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:31.641278 containerd[1455]: time="2025-05-13T00:25:31.641233719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-lj45c,Uid:6da78090-2c51-4b10-b9d4-8db249fc4886,Namespace:calico-system,Attempt:0,}" May 13 00:25:31.641860 kubelet[2476]: E0513 00:25:31.641825 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.641860 kubelet[2476]: W0513 00:25:31.641850 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.641947 kubelet[2476]: E0513 00:25:31.641870 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.642260 kubelet[2476]: E0513 00:25:31.642233 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.642260 kubelet[2476]: W0513 00:25:31.642247 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.642260 kubelet[2476]: E0513 00:25:31.642263 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.642688 kubelet[2476]: E0513 00:25:31.642658 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.642688 kubelet[2476]: W0513 00:25:31.642679 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.642771 kubelet[2476]: E0513 00:25:31.642712 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.642987 kubelet[2476]: E0513 00:25:31.642962 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.643040 kubelet[2476]: W0513 00:25:31.642975 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.643040 kubelet[2476]: E0513 00:25:31.643022 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.643384 kubelet[2476]: E0513 00:25:31.643350 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.643384 kubelet[2476]: W0513 00:25:31.643370 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.643462 kubelet[2476]: E0513 00:25:31.643399 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.643689 kubelet[2476]: E0513 00:25:31.643662 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.643689 kubelet[2476]: W0513 00:25:31.643678 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.643689 kubelet[2476]: E0513 00:25:31.643691 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.643965 kubelet[2476]: E0513 00:25:31.643930 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.643965 kubelet[2476]: W0513 00:25:31.643943 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.644069 kubelet[2476]: E0513 00:25:31.644050 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.644220 kubelet[2476]: E0513 00:25:31.644181 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.644220 kubelet[2476]: W0513 00:25:31.644216 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.644308 kubelet[2476]: E0513 00:25:31.644300 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.644769 kubelet[2476]: E0513 00:25:31.644740 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.644769 kubelet[2476]: W0513 00:25:31.644755 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.644850 kubelet[2476]: E0513 00:25:31.644778 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.645038 kubelet[2476]: E0513 00:25:31.645020 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.645038 kubelet[2476]: W0513 00:25:31.645034 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.645111 kubelet[2476]: E0513 00:25:31.645053 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.645352 kubelet[2476]: E0513 00:25:31.645316 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.645352 kubelet[2476]: W0513 00:25:31.645338 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.645434 kubelet[2476]: E0513 00:25:31.645387 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.645619 kubelet[2476]: E0513 00:25:31.645602 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.645619 kubelet[2476]: W0513 00:25:31.645614 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.645697 kubelet[2476]: E0513 00:25:31.645660 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.645734 systemd[1]: Started cri-containerd-4a4fd5049c466c0795efd3fae6885c5e0d165f0ce54bfb577c6b01bf545c2398.scope - libcontainer container 4a4fd5049c466c0795efd3fae6885c5e0d165f0ce54bfb577c6b01bf545c2398. 
May 13 00:25:31.646039 kubelet[2476]: E0513 00:25:31.645965 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.646039 kubelet[2476]: W0513 00:25:31.645974 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.646121 kubelet[2476]: E0513 00:25:31.646059 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.646365 kubelet[2476]: E0513 00:25:31.646338 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.646365 kubelet[2476]: W0513 00:25:31.646353 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.646459 kubelet[2476]: E0513 00:25:31.646441 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.646622 kubelet[2476]: E0513 00:25:31.646600 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.646837 kubelet[2476]: W0513 00:25:31.646755 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.646837 kubelet[2476]: E0513 00:25:31.646782 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.647202 kubelet[2476]: E0513 00:25:31.647139 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.647202 kubelet[2476]: W0513 00:25:31.647149 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.647353 kubelet[2476]: E0513 00:25:31.647339 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.647709 kubelet[2476]: E0513 00:25:31.647697 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.647786 kubelet[2476]: W0513 00:25:31.647775 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.647984 kubelet[2476]: E0513 00:25:31.647972 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.648212 kubelet[2476]: E0513 00:25:31.648152 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.648212 kubelet[2476]: W0513 00:25:31.648162 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.648349 kubelet[2476]: E0513 00:25:31.648302 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.648735 kubelet[2476]: E0513 00:25:31.648680 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.648735 kubelet[2476]: W0513 00:25:31.648693 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.648913 kubelet[2476]: E0513 00:25:31.648845 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.649925 kubelet[2476]: E0513 00:25:31.649875 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.649925 kubelet[2476]: W0513 00:25:31.649891 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.650136 kubelet[2476]: E0513 00:25:31.650037 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.650618 kubelet[2476]: E0513 00:25:31.650522 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.650806 kubelet[2476]: W0513 00:25:31.650731 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.650973 kubelet[2476]: E0513 00:25:31.650897 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.651483 kubelet[2476]: E0513 00:25:31.651353 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.651483 kubelet[2476]: W0513 00:25:31.651367 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.651696 kubelet[2476]: E0513 00:25:31.651616 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.652297 kubelet[2476]: E0513 00:25:31.652158 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.652297 kubelet[2476]: W0513 00:25:31.652185 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.652453 kubelet[2476]: E0513 00:25:31.652303 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.652499 kubelet[2476]: E0513 00:25:31.652486 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.652566 kubelet[2476]: W0513 00:25:31.652501 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.652736 kubelet[2476]: E0513 00:25:31.652533 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.652965 kubelet[2476]: E0513 00:25:31.652940 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.652965 kubelet[2476]: W0513 00:25:31.652953 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.652965 kubelet[2476]: E0513 00:25:31.652963 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.655812 kubelet[2476]: E0513 00:25:31.655776 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.655812 kubelet[2476]: W0513 00:25:31.655790 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.655812 kubelet[2476]: E0513 00:25:31.655802 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.689650 containerd[1455]: time="2025-05-13T00:25:31.689135822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-787554bc4-8k88b,Uid:17d1d7ef-68e6-4526-90e9-cb7aee244b81,Namespace:calico-system,Attempt:0,} returns sandbox id \"4a4fd5049c466c0795efd3fae6885c5e0d165f0ce54bfb577c6b01bf545c2398\"" May 13 00:25:31.690041 kubelet[2476]: E0513 00:25:31.690011 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:31.692619 containerd[1455]: time="2025-05-13T00:25:31.691846207Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 13 00:25:31.791057 containerd[1455]: time="2025-05-13T00:25:31.790919993Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 May 13 00:25:31.791057 containerd[1455]: time="2025-05-13T00:25:31.790996148Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 May 13 00:25:31.791057 containerd[1455]: time="2025-05-13T00:25:31.791009945Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:25:31.791321 containerd[1455]: time="2025-05-13T00:25:31.791114153Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:25:31.811686 systemd[1]: Started cri-containerd-7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f.scope - libcontainer container 7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f. May 13 00:25:31.840819 containerd[1455]: time="2025-05-13T00:25:31.840704964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-lj45c,Uid:6da78090-2c51-4b10-b9d4-8db249fc4886,Namespace:calico-system,Attempt:0,} returns sandbox id \"7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f\"" May 13 00:25:31.842148 kubelet[2476]: E0513 00:25:31.841640 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:31.855244 kubelet[2476]: E0513 00:25:31.855163 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:31.937238 kubelet[2476]: E0513 00:25:31.937188 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.937238 kubelet[2476]: W0513 00:25:31.937228 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.937427 kubelet[2476]: E0513 00:25:31.937255 2476 
plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.937614 kubelet[2476]: E0513 00:25:31.937584 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.937614 kubelet[2476]: W0513 00:25:31.937601 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.937614 kubelet[2476]: E0513 00:25:31.937613 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.937872 kubelet[2476]: E0513 00:25:31.937855 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.937872 kubelet[2476]: W0513 00:25:31.937870 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.937938 kubelet[2476]: E0513 00:25:31.937881 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.938140 kubelet[2476]: E0513 00:25:31.938121 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.938140 kubelet[2476]: W0513 00:25:31.938135 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.938205 kubelet[2476]: E0513 00:25:31.938146 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.938404 kubelet[2476]: E0513 00:25:31.938387 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.938404 kubelet[2476]: W0513 00:25:31.938401 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.938475 kubelet[2476]: E0513 00:25:31.938415 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.938664 kubelet[2476]: E0513 00:25:31.938647 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.938664 kubelet[2476]: W0513 00:25:31.938661 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.938731 kubelet[2476]: E0513 00:25:31.938672 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.938906 kubelet[2476]: E0513 00:25:31.938890 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.938906 kubelet[2476]: W0513 00:25:31.938904 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.938976 kubelet[2476]: E0513 00:25:31.938915 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.939185 kubelet[2476]: E0513 00:25:31.939168 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.939185 kubelet[2476]: W0513 00:25:31.939183 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.939252 kubelet[2476]: E0513 00:25:31.939194 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.939452 kubelet[2476]: E0513 00:25:31.939431 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.939452 kubelet[2476]: W0513 00:25:31.939445 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.939505 kubelet[2476]: E0513 00:25:31.939458 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.939736 kubelet[2476]: E0513 00:25:31.939717 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.939736 kubelet[2476]: W0513 00:25:31.939731 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.939789 kubelet[2476]: E0513 00:25:31.939740 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.939972 kubelet[2476]: E0513 00:25:31.939953 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.939972 kubelet[2476]: W0513 00:25:31.939966 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.940025 kubelet[2476]: E0513 00:25:31.939976 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.940212 kubelet[2476]: E0513 00:25:31.940190 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.940212 kubelet[2476]: W0513 00:25:31.940203 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.940264 kubelet[2476]: E0513 00:25:31.940213 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.940483 kubelet[2476]: E0513 00:25:31.940455 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.940483 kubelet[2476]: W0513 00:25:31.940470 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.940483 kubelet[2476]: E0513 00:25:31.940481 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:31.940740 kubelet[2476]: E0513 00:25:31.940719 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.940740 kubelet[2476]: W0513 00:25:31.940733 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.940785 kubelet[2476]: E0513 00:25:31.940744 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:31.940974 kubelet[2476]: E0513 00:25:31.940954 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:31.940974 kubelet[2476]: W0513 00:25:31.940968 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:31.941026 kubelet[2476]: E0513 00:25:31.940981 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:32.560795 kubelet[2476]: E0513 00:25:32.560727 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb" May 13 00:25:32.596718 kubelet[2476]: E0513 00:25:32.596681 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:32.645133 kubelet[2476]: E0513 00:25:32.645100 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.645133 kubelet[2476]: W0513 00:25:32.645125 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.645243 kubelet[2476]: E0513 00:25:32.645151 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:32.645490 kubelet[2476]: E0513 00:25:32.645466 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.645490 kubelet[2476]: W0513 00:25:32.645482 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.645556 kubelet[2476]: E0513 00:25:32.645493 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:32.645774 kubelet[2476]: E0513 00:25:32.645750 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.645774 kubelet[2476]: W0513 00:25:32.645767 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.645822 kubelet[2476]: E0513 00:25:32.645778 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:32.646030 kubelet[2476]: E0513 00:25:32.646006 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.646030 kubelet[2476]: W0513 00:25:32.646022 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.646076 kubelet[2476]: E0513 00:25:32.646033 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:32.646274 kubelet[2476]: E0513 00:25:32.646251 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.646274 kubelet[2476]: W0513 00:25:32.646266 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.646339 kubelet[2476]: E0513 00:25:32.646277 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:32.646534 kubelet[2476]: E0513 00:25:32.646511 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.646577 kubelet[2476]: W0513 00:25:32.646529 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.646577 kubelet[2476]: E0513 00:25:32.646564 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:32.646807 kubelet[2476]: E0513 00:25:32.646784 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.646807 kubelet[2476]: W0513 00:25:32.646799 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.646857 kubelet[2476]: E0513 00:25:32.646813 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:32.647038 kubelet[2476]: E0513 00:25:32.647023 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.647038 kubelet[2476]: W0513 00:25:32.647037 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.647086 kubelet[2476]: E0513 00:25:32.647047 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:32.647276 kubelet[2476]: E0513 00:25:32.647260 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.647297 kubelet[2476]: W0513 00:25:32.647274 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.647297 kubelet[2476]: E0513 00:25:32.647286 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:32.647525 kubelet[2476]: E0513 00:25:32.647509 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.647570 kubelet[2476]: W0513 00:25:32.647523 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.647570 kubelet[2476]: E0513 00:25:32.647555 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:32.647777 kubelet[2476]: E0513 00:25:32.647761 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.647806 kubelet[2476]: W0513 00:25:32.647775 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.647806 kubelet[2476]: E0513 00:25:32.647786 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:32.648021 kubelet[2476]: E0513 00:25:32.648006 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.648021 kubelet[2476]: W0513 00:25:32.648019 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.648069 kubelet[2476]: E0513 00:25:32.648031 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:32.648278 kubelet[2476]: E0513 00:25:32.648255 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.648278 kubelet[2476]: W0513 00:25:32.648270 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.648338 kubelet[2476]: E0513 00:25:32.648281 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:32.648528 kubelet[2476]: E0513 00:25:32.648512 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.648528 kubelet[2476]: W0513 00:25:32.648525 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.648599 kubelet[2476]: E0513 00:25:32.648553 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:32.648782 kubelet[2476]: E0513 00:25:32.648766 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:32.648808 kubelet[2476]: W0513 00:25:32.648780 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:32.648808 kubelet[2476]: E0513 00:25:32.648791 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:33.492004 kubelet[2476]: E0513 00:25:33.491974 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:33.554022 kubelet[2476]: E0513 00:25:33.553991 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.554022 kubelet[2476]: W0513 00:25:33.554011 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.554022 kubelet[2476]: E0513 00:25:33.554031 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:33.554255 kubelet[2476]: E0513 00:25:33.554240 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.554255 kubelet[2476]: W0513 00:25:33.554254 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.554304 kubelet[2476]: E0513 00:25:33.554265 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:33.554508 kubelet[2476]: E0513 00:25:33.554478 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.554508 kubelet[2476]: W0513 00:25:33.554499 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.554508 kubelet[2476]: E0513 00:25:33.554508 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:33.554847 kubelet[2476]: E0513 00:25:33.554811 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.554847 kubelet[2476]: W0513 00:25:33.554836 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.554847 kubelet[2476]: E0513 00:25:33.554845 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:33.555094 kubelet[2476]: E0513 00:25:33.555079 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.555094 kubelet[2476]: W0513 00:25:33.555090 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.555145 kubelet[2476]: E0513 00:25:33.555099 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:33.598052 kubelet[2476]: E0513 00:25:33.598004 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:33.656089 kubelet[2476]: E0513 00:25:33.656029 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.656089 kubelet[2476]: W0513 00:25:33.656052 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.656089 kubelet[2476]: E0513 00:25:33.656074 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:33.656654 kubelet[2476]: E0513 00:25:33.656313 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.656654 kubelet[2476]: W0513 00:25:33.656322 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.656654 kubelet[2476]: E0513 00:25:33.656339 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:33.656654 kubelet[2476]: E0513 00:25:33.656599 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.656654 kubelet[2476]: W0513 00:25:33.656606 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.656654 kubelet[2476]: E0513 00:25:33.656615 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:33.656879 kubelet[2476]: E0513 00:25:33.656811 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.656879 kubelet[2476]: W0513 00:25:33.656819 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.656879 kubelet[2476]: E0513 00:25:33.656827 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:33.657076 kubelet[2476]: E0513 00:25:33.657052 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:33.657076 kubelet[2476]: W0513 00:25:33.657064 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:33.657076 kubelet[2476]: E0513 00:25:33.657072 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.301789 containerd[1455]: time="2025-05-13T00:25:34.301735446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:34.302654 containerd[1455]: time="2025-05-13T00:25:34.302591013Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 13 00:25:34.304058 containerd[1455]: time="2025-05-13T00:25:34.304027703Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:34.306243 containerd[1455]: time="2025-05-13T00:25:34.306211784Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:34.306858 containerd[1455]: time="2025-05-13T00:25:34.306814219Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest 
\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 2.614941542s" May 13 00:25:34.306893 containerd[1455]: time="2025-05-13T00:25:34.306857060Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 13 00:25:34.307913 containerd[1455]: time="2025-05-13T00:25:34.307885664Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 13 00:25:34.316534 containerd[1455]: time="2025-05-13T00:25:34.316496368Z" level=info msg="CreateContainer within sandbox \"4a4fd5049c466c0795efd3fae6885c5e0d165f0ce54bfb577c6b01bf545c2398\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 00:25:34.332263 containerd[1455]: time="2025-05-13T00:25:34.332219698Z" level=info msg="CreateContainer within sandbox \"4a4fd5049c466c0795efd3fae6885c5e0d165f0ce54bfb577c6b01bf545c2398\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ebe9c9d89e42172802fda02cb372c2d4c01b3362bad91cf3d7654b80eebe81b8\"" May 13 00:25:34.332791 containerd[1455]: time="2025-05-13T00:25:34.332766076Z" level=info msg="StartContainer for \"ebe9c9d89e42172802fda02cb372c2d4c01b3362bad91cf3d7654b80eebe81b8\"" May 13 00:25:34.363690 systemd[1]: Started cri-containerd-ebe9c9d89e42172802fda02cb372c2d4c01b3362bad91cf3d7654b80eebe81b8.scope - libcontainer container ebe9c9d89e42172802fda02cb372c2d4c01b3362bad91cf3d7654b80eebe81b8. May 13 00:25:34.457898 update_engine[1439]: I20250513 00:25:34.457823 1439 update_attempter.cc:509] Updating boot flags... 
May 13 00:25:34.467860 containerd[1455]: time="2025-05-13T00:25:34.467733131Z" level=info msg="StartContainer for \"ebe9c9d89e42172802fda02cb372c2d4c01b3362bad91cf3d7654b80eebe81b8\" returns successfully" May 13 00:25:34.491582 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (3182) May 13 00:25:34.534632 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (3182) May 13 00:25:34.562244 kubelet[2476]: E0513 00:25:34.561182 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb" May 13 00:25:34.574575 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (3182) May 13 00:25:34.603721 kubelet[2476]: E0513 00:25:34.603683 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:34.663357 kubelet[2476]: E0513 00:25:34.663321 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.663357 kubelet[2476]: W0513 00:25:34.663348 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.663506 kubelet[2476]: E0513 00:25:34.663370 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.663706 kubelet[2476]: E0513 00:25:34.663679 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.663706 kubelet[2476]: W0513 00:25:34.663700 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.663839 kubelet[2476]: E0513 00:25:34.663724 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.664074 kubelet[2476]: E0513 00:25:34.664058 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.664074 kubelet[2476]: W0513 00:25:34.664070 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.664138 kubelet[2476]: E0513 00:25:34.664079 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.664316 kubelet[2476]: E0513 00:25:34.664291 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.664316 kubelet[2476]: W0513 00:25:34.664307 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.664374 kubelet[2476]: E0513 00:25:34.664318 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.664683 kubelet[2476]: E0513 00:25:34.664664 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.664683 kubelet[2476]: W0513 00:25:34.664678 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.664752 kubelet[2476]: E0513 00:25:34.664689 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.664921 kubelet[2476]: E0513 00:25:34.664904 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.664921 kubelet[2476]: W0513 00:25:34.664918 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.664972 kubelet[2476]: E0513 00:25:34.664928 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.665165 kubelet[2476]: E0513 00:25:34.665147 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.665165 kubelet[2476]: W0513 00:25:34.665162 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.665219 kubelet[2476]: E0513 00:25:34.665174 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.665429 kubelet[2476]: E0513 00:25:34.665412 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.665429 kubelet[2476]: W0513 00:25:34.665426 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.665494 kubelet[2476]: E0513 00:25:34.665440 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.665739 kubelet[2476]: E0513 00:25:34.665721 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.665739 kubelet[2476]: W0513 00:25:34.665735 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.665815 kubelet[2476]: E0513 00:25:34.665746 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.665981 kubelet[2476]: E0513 00:25:34.665965 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.665981 kubelet[2476]: W0513 00:25:34.665979 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.666032 kubelet[2476]: E0513 00:25:34.665990 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.666219 kubelet[2476]: E0513 00:25:34.666202 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.666219 kubelet[2476]: W0513 00:25:34.666216 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.666270 kubelet[2476]: E0513 00:25:34.666228 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.666464 kubelet[2476]: E0513 00:25:34.666448 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.666464 kubelet[2476]: W0513 00:25:34.666463 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.666515 kubelet[2476]: E0513 00:25:34.666477 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.666771 kubelet[2476]: E0513 00:25:34.666753 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.666771 kubelet[2476]: W0513 00:25:34.666768 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.666828 kubelet[2476]: E0513 00:25:34.666778 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.667023 kubelet[2476]: E0513 00:25:34.667007 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.667023 kubelet[2476]: W0513 00:25:34.667021 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.667072 kubelet[2476]: E0513 00:25:34.667032 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.667281 kubelet[2476]: E0513 00:25:34.667257 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.667281 kubelet[2476]: W0513 00:25:34.667272 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.667337 kubelet[2476]: E0513 00:25:34.667283 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.667640 kubelet[2476]: E0513 00:25:34.667623 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.667640 kubelet[2476]: W0513 00:25:34.667637 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.667693 kubelet[2476]: E0513 00:25:34.667648 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.667928 kubelet[2476]: E0513 00:25:34.667911 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.667928 kubelet[2476]: W0513 00:25:34.667924 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.667977 kubelet[2476]: E0513 00:25:34.667942 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.668224 kubelet[2476]: E0513 00:25:34.668207 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.668224 kubelet[2476]: W0513 00:25:34.668221 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.668277 kubelet[2476]: E0513 00:25:34.668238 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.668575 kubelet[2476]: E0513 00:25:34.668525 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.668614 kubelet[2476]: W0513 00:25:34.668574 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.668614 kubelet[2476]: E0513 00:25:34.668602 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.668857 kubelet[2476]: E0513 00:25:34.668834 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.668885 kubelet[2476]: W0513 00:25:34.668856 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.668918 kubelet[2476]: E0513 00:25:34.668888 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.669167 kubelet[2476]: E0513 00:25:34.669143 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.669167 kubelet[2476]: W0513 00:25:34.669154 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.669219 kubelet[2476]: E0513 00:25:34.669169 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.669459 kubelet[2476]: E0513 00:25:34.669444 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.669459 kubelet[2476]: W0513 00:25:34.669454 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.669519 kubelet[2476]: E0513 00:25:34.669501 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.669692 kubelet[2476]: E0513 00:25:34.669677 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.669692 kubelet[2476]: W0513 00:25:34.669688 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.669753 kubelet[2476]: E0513 00:25:34.669728 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.669925 kubelet[2476]: E0513 00:25:34.669901 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.669925 kubelet[2476]: W0513 00:25:34.669921 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.670034 kubelet[2476]: E0513 00:25:34.669937 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.670157 kubelet[2476]: E0513 00:25:34.670139 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.670191 kubelet[2476]: W0513 00:25:34.670159 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.670191 kubelet[2476]: E0513 00:25:34.670176 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.670407 kubelet[2476]: E0513 00:25:34.670390 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.670407 kubelet[2476]: W0513 00:25:34.670400 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.670486 kubelet[2476]: E0513 00:25:34.670414 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.670674 kubelet[2476]: E0513 00:25:34.670656 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.670674 kubelet[2476]: W0513 00:25:34.670670 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.670768 kubelet[2476]: E0513 00:25:34.670685 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.670903 kubelet[2476]: E0513 00:25:34.670883 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.670903 kubelet[2476]: W0513 00:25:34.670898 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.670975 kubelet[2476]: E0513 00:25:34.670914 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.671222 kubelet[2476]: E0513 00:25:34.671198 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.671222 kubelet[2476]: W0513 00:25:34.671212 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.671403 kubelet[2476]: E0513 00:25:34.671306 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.672149 kubelet[2476]: E0513 00:25:34.672036 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.672149 kubelet[2476]: W0513 00:25:34.672055 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.672149 kubelet[2476]: E0513 00:25:34.672078 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.672307 kubelet[2476]: E0513 00:25:34.672283 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.672307 kubelet[2476]: W0513 00:25:34.672300 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.672781 kubelet[2476]: E0513 00:25:34.672408 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:34.672781 kubelet[2476]: E0513 00:25:34.672485 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.672781 kubelet[2476]: W0513 00:25:34.672494 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.672781 kubelet[2476]: E0513 00:25:34.672508 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:34.673162 kubelet[2476]: E0513 00:25:34.673133 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:34.673162 kubelet[2476]: W0513 00:25:34.673151 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:34.673162 kubelet[2476]: E0513 00:25:34.673161 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.605788 kubelet[2476]: I0513 00:25:35.605755 2476 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 00:25:35.606203 kubelet[2476]: E0513 00:25:35.606096 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:35.675996 kubelet[2476]: E0513 00:25:35.675959 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.675996 kubelet[2476]: W0513 00:25:35.675985 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.675996 kubelet[2476]: E0513 00:25:35.676006 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.676235 kubelet[2476]: E0513 00:25:35.676220 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.676235 kubelet[2476]: W0513 00:25:35.676231 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.676333 kubelet[2476]: E0513 00:25:35.676242 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.676660 kubelet[2476]: E0513 00:25:35.676630 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.676692 kubelet[2476]: W0513 00:25:35.676660 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.676722 kubelet[2476]: E0513 00:25:35.676690 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.677196 kubelet[2476]: E0513 00:25:35.677023 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.677196 kubelet[2476]: W0513 00:25:35.677037 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.677196 kubelet[2476]: E0513 00:25:35.677046 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.677402 kubelet[2476]: E0513 00:25:35.677369 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.677441 kubelet[2476]: W0513 00:25:35.677398 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.677441 kubelet[2476]: E0513 00:25:35.677425 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.677787 kubelet[2476]: E0513 00:25:35.677770 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.677787 kubelet[2476]: W0513 00:25:35.677783 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.677855 kubelet[2476]: E0513 00:25:35.677795 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.678050 kubelet[2476]: E0513 00:25:35.678034 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.678050 kubelet[2476]: W0513 00:25:35.678046 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.678096 kubelet[2476]: E0513 00:25:35.678055 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.678358 kubelet[2476]: E0513 00:25:35.678317 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.678358 kubelet[2476]: W0513 00:25:35.678349 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.678358 kubelet[2476]: E0513 00:25:35.678359 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.678621 kubelet[2476]: E0513 00:25:35.678592 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.678621 kubelet[2476]: W0513 00:25:35.678607 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.678621 kubelet[2476]: E0513 00:25:35.678617 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.678836 kubelet[2476]: E0513 00:25:35.678816 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.678836 kubelet[2476]: W0513 00:25:35.678826 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.678836 kubelet[2476]: E0513 00:25:35.678833 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.679083 kubelet[2476]: E0513 00:25:35.679059 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.679083 kubelet[2476]: W0513 00:25:35.679074 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.679155 kubelet[2476]: E0513 00:25:35.679086 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.679331 kubelet[2476]: E0513 00:25:35.679308 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.679331 kubelet[2476]: W0513 00:25:35.679319 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.679421 kubelet[2476]: E0513 00:25:35.679336 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.679611 kubelet[2476]: E0513 00:25:35.679585 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.679780 kubelet[2476]: W0513 00:25:35.679661 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.679780 kubelet[2476]: E0513 00:25:35.679679 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.679996 kubelet[2476]: E0513 00:25:35.679972 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.679996 kubelet[2476]: W0513 00:25:35.679987 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.679996 kubelet[2476]: E0513 00:25:35.680001 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.680263 kubelet[2476]: E0513 00:25:35.680246 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.680263 kubelet[2476]: W0513 00:25:35.680260 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.680313 kubelet[2476]: E0513 00:25:35.680272 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.776622 kubelet[2476]: E0513 00:25:35.776531 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.776622 kubelet[2476]: W0513 00:25:35.776575 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.776622 kubelet[2476]: E0513 00:25:35.776596 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.776844 kubelet[2476]: E0513 00:25:35.776830 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.776844 kubelet[2476]: W0513 00:25:35.776839 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.776893 kubelet[2476]: E0513 00:25:35.776852 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.777103 kubelet[2476]: E0513 00:25:35.777081 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.777103 kubelet[2476]: W0513 00:25:35.777093 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.777165 kubelet[2476]: E0513 00:25:35.777105 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.777467 kubelet[2476]: E0513 00:25:35.777425 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.777498 kubelet[2476]: W0513 00:25:35.777461 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.777521 kubelet[2476]: E0513 00:25:35.777498 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.777826 kubelet[2476]: E0513 00:25:35.777808 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.777826 kubelet[2476]: W0513 00:25:35.777822 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.777888 kubelet[2476]: E0513 00:25:35.777840 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.778132 kubelet[2476]: E0513 00:25:35.778115 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.778132 kubelet[2476]: W0513 00:25:35.778128 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.778217 kubelet[2476]: E0513 00:25:35.778178 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.778367 kubelet[2476]: E0513 00:25:35.778352 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.778367 kubelet[2476]: W0513 00:25:35.778365 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.778416 kubelet[2476]: E0513 00:25:35.778396 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.778598 kubelet[2476]: E0513 00:25:35.778582 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.778598 kubelet[2476]: W0513 00:25:35.778595 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.778647 kubelet[2476]: E0513 00:25:35.778625 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.778845 kubelet[2476]: E0513 00:25:35.778830 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.778845 kubelet[2476]: W0513 00:25:35.778843 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.778950 kubelet[2476]: E0513 00:25:35.778861 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.779234 kubelet[2476]: E0513 00:25:35.779213 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.779266 kubelet[2476]: W0513 00:25:35.779234 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.779266 kubelet[2476]: E0513 00:25:35.779255 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.779478 kubelet[2476]: E0513 00:25:35.779464 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.779478 kubelet[2476]: W0513 00:25:35.779475 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.779530 kubelet[2476]: E0513 00:25:35.779488 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.779730 kubelet[2476]: E0513 00:25:35.779713 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.779730 kubelet[2476]: W0513 00:25:35.779728 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.779778 kubelet[2476]: E0513 00:25:35.779745 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.779961 kubelet[2476]: E0513 00:25:35.779946 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.779961 kubelet[2476]: W0513 00:25:35.779959 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.780012 kubelet[2476]: E0513 00:25:35.779975 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.780213 kubelet[2476]: E0513 00:25:35.780197 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.780213 kubelet[2476]: W0513 00:25:35.780211 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.780262 kubelet[2476]: E0513 00:25:35.780226 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.780469 kubelet[2476]: E0513 00:25:35.780456 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.780469 kubelet[2476]: W0513 00:25:35.780466 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.780518 kubelet[2476]: E0513 00:25:35.780480 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.780814 kubelet[2476]: E0513 00:25:35.780799 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.780814 kubelet[2476]: W0513 00:25:35.780810 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.780861 kubelet[2476]: E0513 00:25:35.780819 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:35.781062 kubelet[2476]: E0513 00:25:35.781048 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.781062 kubelet[2476]: W0513 00:25:35.781059 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.781116 kubelet[2476]: E0513 00:25:35.781072 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:35.781294 kubelet[2476]: E0513 00:25:35.781279 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:35.781294 kubelet[2476]: W0513 00:25:35.781291 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:35.781347 kubelet[2476]: E0513 00:25:35.781299 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:36.073560 kubelet[2476]: I0513 00:25:36.073470 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-787554bc4-8k88b" podStartSLOduration=2.455941507 podStartE2EDuration="5.072357043s" podCreationTimestamp="2025-05-13 00:25:31 +0000 UTC" firstStartedPulling="2025-05-13 00:25:31.691201659 +0000 UTC m=+13.206781506" lastFinishedPulling="2025-05-13 00:25:34.307617195 +0000 UTC m=+15.823197042" observedRunningTime="2025-05-13 00:25:34.731237232 +0000 UTC m=+16.246817079" watchObservedRunningTime="2025-05-13 00:25:36.072357043 +0000 UTC m=+17.587936890" May 13 00:25:36.561312 kubelet[2476]: E0513 00:25:36.561269 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb" May 13 00:25:36.591781 containerd[1455]: time="2025-05-13T00:25:36.591722912Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:36.592619 containerd[1455]: 
time="2025-05-13T00:25:36.592572184Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 13 00:25:36.593707 containerd[1455]: time="2025-05-13T00:25:36.593678372Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:36.595928 containerd[1455]: time="2025-05-13T00:25:36.595899386Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:36.596616 containerd[1455]: time="2025-05-13T00:25:36.596561041Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.288643497s" May 13 00:25:36.596616 containerd[1455]: time="2025-05-13T00:25:36.596590648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 13 00:25:36.598633 containerd[1455]: time="2025-05-13T00:25:36.598582848Z" level=info msg="CreateContainer within sandbox \"7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 00:25:36.606853 kubelet[2476]: E0513 00:25:36.606820 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:36.615282 containerd[1455]: 
time="2025-05-13T00:25:36.615223251Z" level=info msg="CreateContainer within sandbox \"7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b\"" May 13 00:25:36.616099 containerd[1455]: time="2025-05-13T00:25:36.616045391Z" level=info msg="StartContainer for \"0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b\"" May 13 00:25:36.656709 systemd[1]: Started cri-containerd-0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b.scope - libcontainer container 0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b. May 13 00:25:36.685852 kubelet[2476]: E0513 00:25:36.685816 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:36.686596 kubelet[2476]: W0513 00:25:36.685959 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:36.688567 containerd[1455]: time="2025-05-13T00:25:36.688514070Z" level=info msg="StartContainer for \"0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b\" returns successfully" May 13 00:25:36.688891 kubelet[2476]: E0513 00:25:36.688826 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:36.689144 kubelet[2476]: E0513 00:25:36.689101 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:36.689144 kubelet[2476]: W0513 00:25:36.689119 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:36.689144 kubelet[2476]: E0513 00:25:36.689132 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:36.689555 kubelet[2476]: E0513 00:25:36.689418 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:36.689555 kubelet[2476]: W0513 00:25:36.689440 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:36.689555 kubelet[2476]: E0513 00:25:36.689465 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:36.689933 kubelet[2476]: E0513 00:25:36.689920 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:36.690090 kubelet[2476]: W0513 00:25:36.690004 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:36.690090 kubelet[2476]: E0513 00:25:36.690017 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:36.690415 kubelet[2476]: E0513 00:25:36.690327 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:36.690415 kubelet[2476]: W0513 00:25:36.690338 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:36.690415 kubelet[2476]: E0513 00:25:36.690347 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 00:25:36.691088 kubelet[2476]: E0513 00:25:36.690963 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:36.691088 kubelet[2476]: W0513 00:25:36.690976 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:36.691088 kubelet[2476]: E0513 00:25:36.690998 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 00:25:36.691438 kubelet[2476]: E0513 00:25:36.691308 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 00:25:36.691438 kubelet[2476]: W0513 00:25:36.691330 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 00:25:36.691438 kubelet[2476]: E0513 00:25:36.691357 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 13 00:25:36.691614 kubelet[2476]: E0513 00:25:36.691602 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 00:25:36.691678 kubelet[2476]: W0513 00:25:36.691666 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 00:25:36.691728 kubelet[2476]: E0513 00:25:36.691717 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 00:25:36.692201 kubelet[2476]: E0513 00:25:36.692096 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 00:25:36.692201 kubelet[2476]: W0513 00:25:36.692109 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 00:25:36.692201 kubelet[2476]: E0513 00:25:36.692120 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 00:25:36.692484 kubelet[2476]: E0513 00:25:36.692389 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 00:25:36.692484 kubelet[2476]: W0513 00:25:36.692400 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 00:25:36.692484 kubelet[2476]: E0513 00:25:36.692409 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 00:25:36.692673 kubelet[2476]: E0513 00:25:36.692662 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 00:25:36.692726 kubelet[2476]: W0513 00:25:36.692715 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 00:25:36.692775 kubelet[2476]: E0513 00:25:36.692765 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 00:25:36.693027 kubelet[2476]: E0513 00:25:36.693015 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 00:25:36.693090 kubelet[2476]: W0513 00:25:36.693078 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 00:25:36.693222 kubelet[2476]: E0513 00:25:36.693131 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 00:25:36.693459 kubelet[2476]: E0513 00:25:36.693353 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 00:25:36.693459 kubelet[2476]: W0513 00:25:36.693363 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 00:25:36.693459 kubelet[2476]: E0513 00:25:36.693372 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 00:25:36.693625 kubelet[2476]: E0513 00:25:36.693613 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 00:25:36.693693 kubelet[2476]: W0513 00:25:36.693681 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 00:25:36.693742 kubelet[2476]: E0513 00:25:36.693732 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 00:25:36.693995 kubelet[2476]: E0513 00:25:36.693983 2476 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 00:25:36.694102 kubelet[2476]: W0513 00:25:36.694038 2476 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 00:25:36.694102 kubelet[2476]: E0513 00:25:36.694050 2476 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 00:25:36.701684 systemd[1]: cri-containerd-0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b.scope: Deactivated successfully.
May 13 00:25:36.726110 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b-rootfs.mount: Deactivated successfully.
May 13 00:25:36.745004 containerd[1455]: time="2025-05-13T00:25:36.742862860Z" level=info msg="shim disconnected" id=0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b namespace=k8s.io
May 13 00:25:36.745004 containerd[1455]: time="2025-05-13T00:25:36.744997881Z" level=warning msg="cleaning up after shim disconnected" id=0bdeb542d5d5bcc03652266169c461fbaeecf6135535393126b6e74919cd032b namespace=k8s.io
May 13 00:25:36.745004 containerd[1455]: time="2025-05-13T00:25:36.745009082Z" level=info msg="cleaning up dead shim" namespace=k8s.io
May 13 00:25:37.609135 kubelet[2476]: E0513 00:25:37.609090 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:37.609688 kubelet[2476]: E0513 00:25:37.609167 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:37.610285 containerd[1455]: time="2025-05-13T00:25:37.609759774Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\""
May 13 00:25:38.560458 kubelet[2476]: E0513 00:25:38.560408 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb"
May 13 00:25:40.560453 kubelet[2476]: E0513 00:25:40.560397 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb"
May 13 00:25:42.560551 kubelet[2476]: E0513 00:25:42.560507 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb"
May 13 00:25:43.301069 containerd[1455]: time="2025-05-13T00:25:43.301016051Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:43.301869 containerd[1455]: time="2025-05-13T00:25:43.301809260Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683"
May 13 00:25:43.302756 containerd[1455]: time="2025-05-13T00:25:43.302723197Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:43.304862 containerd[1455]: time="2025-05-13T00:25:43.304830790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 00:25:43.305391 containerd[1455]: time="2025-05-13T00:25:43.305359739Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 5.695558967s"
May 13 00:25:43.305421 containerd[1455]: time="2025-05-13T00:25:43.305389975Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\""
May 13 00:25:43.311295 containerd[1455]: time="2025-05-13T00:25:43.311263804Z" level=info msg="CreateContainer within sandbox \"7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
May 13 00:25:43.326870 containerd[1455]: time="2025-05-13T00:25:43.326832781Z" level=info msg="CreateContainer within sandbox \"7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701\""
May 13 00:25:43.327333 containerd[1455]: time="2025-05-13T00:25:43.327303872Z" level=info msg="StartContainer for \"8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701\""
May 13 00:25:43.358669 systemd[1]: Started cri-containerd-8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701.scope - libcontainer container 8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701.
May 13 00:25:43.387836 containerd[1455]: time="2025-05-13T00:25:43.387785351Z" level=info msg="StartContainer for \"8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701\" returns successfully"
May 13 00:25:43.758582 systemd[1]: Started sshd@7-10.0.0.89:22-10.0.0.1:38754.service - OpenSSH per-connection server daemon (10.0.0.1:38754).
May 13 00:25:43.834170 kubelet[2476]: E0513 00:25:43.834128 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:43.870651 sshd[3399]: Accepted publickey for core from 10.0.0.1 port 38754 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:25:43.872409 sshd[3399]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:25:43.878916 systemd-logind[1436]: New session 8 of user core.
May 13 00:25:43.882742 systemd[1]: Started session-8.scope - Session 8 of User core.
May 13 00:25:44.007045 sshd[3399]: pam_unix(sshd:session): session closed for user core
May 13 00:25:44.011869 systemd[1]: sshd@7-10.0.0.89:22-10.0.0.1:38754.service: Deactivated successfully.
May 13 00:25:44.013973 systemd[1]: session-8.scope: Deactivated successfully.
May 13 00:25:44.014795 systemd-logind[1436]: Session 8 logged out. Waiting for processes to exit.
May 13 00:25:44.015695 systemd-logind[1436]: Removed session 8.
May 13 00:25:44.483380 containerd[1455]: time="2025-05-13T00:25:44.483170885Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 13 00:25:44.486488 systemd[1]: cri-containerd-8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701.scope: Deactivated successfully.
May 13 00:25:44.499833 kubelet[2476]: I0513 00:25:44.499802 2476 kubelet_node_status.go:502] "Fast updating node status as it just became ready"
May 13 00:25:44.509798 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701-rootfs.mount: Deactivated successfully.
May 13 00:25:44.545655 systemd[1]: Created slice kubepods-burstable-pod7de40860_c4d9_46ea_8cfc_0e1375b1cb33.slice - libcontainer container kubepods-burstable-pod7de40860_c4d9_46ea_8cfc_0e1375b1cb33.slice.
May 13 00:25:44.557646 systemd[1]: Created slice kubepods-besteffort-pod17288722_b2a9_43e7_9881_20dbd0f1ae77.slice - libcontainer container kubepods-besteffort-pod17288722_b2a9_43e7_9881_20dbd0f1ae77.slice.
May 13 00:25:44.564033 systemd[1]: Created slice kubepods-burstable-pod44594c6e_aa9f_4fbd_9913_db1bedcc1034.slice - libcontainer container kubepods-burstable-pod44594c6e_aa9f_4fbd_9913_db1bedcc1034.slice.
May 13 00:25:44.569095 systemd[1]: Created slice kubepods-besteffort-pod48db1284_6ae2_4942_9a9b_c26e0e5a874e.slice - libcontainer container kubepods-besteffort-pod48db1284_6ae2_4942_9a9b_c26e0e5a874e.slice.
May 13 00:25:44.573943 systemd[1]: Created slice kubepods-besteffort-pod3654df1f_4f7d_4d28_8abd_dfe02b50107c.slice - libcontainer container kubepods-besteffort-pod3654df1f_4f7d_4d28_8abd_dfe02b50107c.slice.
May 13 00:25:44.578234 systemd[1]: Created slice kubepods-besteffort-pod93974201_6dc9_4f10_8077_3a1e2417dccb.slice - libcontainer container kubepods-besteffort-pod93974201_6dc9_4f10_8077_3a1e2417dccb.slice.
May 13 00:25:44.644496 kubelet[2476]: I0513 00:25:44.644439 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbm5g\" (UniqueName: \"kubernetes.io/projected/44594c6e-aa9f-4fbd-9913-db1bedcc1034-kube-api-access-zbm5g\") pod \"coredns-668d6bf9bc-t7b9x\" (UID: \"44594c6e-aa9f-4fbd-9913-db1bedcc1034\") " pod="kube-system/coredns-668d6bf9bc-t7b9x"
May 13 00:25:44.644496 kubelet[2476]: I0513 00:25:44.644481 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqfr\" (UniqueName: \"kubernetes.io/projected/17288722-b2a9-43e7-9881-20dbd0f1ae77-kube-api-access-lhqfr\") pod \"calico-kube-controllers-79ff8699d-dbkcf\" (UID: \"17288722-b2a9-43e7-9881-20dbd0f1ae77\") " pod="calico-system/calico-kube-controllers-79ff8699d-dbkcf"
May 13 00:25:44.644496 kubelet[2476]: I0513 00:25:44.644507 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17288722-b2a9-43e7-9881-20dbd0f1ae77-tigera-ca-bundle\") pod \"calico-kube-controllers-79ff8699d-dbkcf\" (UID: \"17288722-b2a9-43e7-9881-20dbd0f1ae77\") " pod="calico-system/calico-kube-controllers-79ff8699d-dbkcf"
May 13 00:25:44.644764 kubelet[2476]: I0513 00:25:44.644525 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/48db1284-6ae2-4942-9a9b-c26e0e5a874e-calico-apiserver-certs\") pod \"calico-apiserver-9bb7cb75b-6pgjr\" (UID: \"48db1284-6ae2-4942-9a9b-c26e0e5a874e\") " pod="calico-apiserver/calico-apiserver-9bb7cb75b-6pgjr"
May 13 00:25:44.644764 kubelet[2476]: I0513 00:25:44.644649 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8d4\" (UniqueName: \"kubernetes.io/projected/3654df1f-4f7d-4d28-8abd-dfe02b50107c-kube-api-access-8x8d4\") pod \"calico-apiserver-9bb7cb75b-h6m65\" (UID: \"3654df1f-4f7d-4d28-8abd-dfe02b50107c\") " pod="calico-apiserver/calico-apiserver-9bb7cb75b-h6m65"
May 13 00:25:44.644764 kubelet[2476]: I0513 00:25:44.644703 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7de40860-c4d9-46ea-8cfc-0e1375b1cb33-config-volume\") pod \"coredns-668d6bf9bc-tclg9\" (UID: \"7de40860-c4d9-46ea-8cfc-0e1375b1cb33\") " pod="kube-system/coredns-668d6bf9bc-tclg9"
May 13 00:25:44.644764 kubelet[2476]: I0513 00:25:44.644718 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhkbl\" (UniqueName: \"kubernetes.io/projected/7de40860-c4d9-46ea-8cfc-0e1375b1cb33-kube-api-access-jhkbl\") pod \"coredns-668d6bf9bc-tclg9\" (UID: \"7de40860-c4d9-46ea-8cfc-0e1375b1cb33\") " pod="kube-system/coredns-668d6bf9bc-tclg9"
May 13 00:25:44.644764 kubelet[2476]: I0513 00:25:44.644735 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3654df1f-4f7d-4d28-8abd-dfe02b50107c-calico-apiserver-certs\") pod \"calico-apiserver-9bb7cb75b-h6m65\" (UID: \"3654df1f-4f7d-4d28-8abd-dfe02b50107c\") " pod="calico-apiserver/calico-apiserver-9bb7cb75b-h6m65"
May 13 00:25:44.644898 kubelet[2476]: I0513 00:25:44.644755 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44594c6e-aa9f-4fbd-9913-db1bedcc1034-config-volume\") pod \"coredns-668d6bf9bc-t7b9x\" (UID: \"44594c6e-aa9f-4fbd-9913-db1bedcc1034\") " pod="kube-system/coredns-668d6bf9bc-t7b9x"
May 13 00:25:44.644898 kubelet[2476]: I0513 00:25:44.644770 2476 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6mqf\" (UniqueName: \"kubernetes.io/projected/48db1284-6ae2-4942-9a9b-c26e0e5a874e-kube-api-access-b6mqf\") pod \"calico-apiserver-9bb7cb75b-6pgjr\" (UID: \"48db1284-6ae2-4942-9a9b-c26e0e5a874e\") " pod="calico-apiserver/calico-apiserver-9bb7cb75b-6pgjr"
May 13 00:25:44.743868 containerd[1455]: time="2025-05-13T00:25:44.743821430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xgblg,Uid:93974201-6dc9-4f10-8077-3a1e2417dccb,Namespace:calico-system,Attempt:0,}"
May 13 00:25:44.834851 kubelet[2476]: E0513 00:25:44.834817 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:44.849308 kubelet[2476]: E0513 00:25:44.849274 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:44.849793 containerd[1455]: time="2025-05-13T00:25:44.849748146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tclg9,Uid:7de40860-c4d9-46ea-8cfc-0e1375b1cb33,Namespace:kube-system,Attempt:0,}"
May 13 00:25:44.860695 containerd[1455]: time="2025-05-13T00:25:44.860645037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79ff8699d-dbkcf,Uid:17288722-b2a9-43e7-9881-20dbd0f1ae77,Namespace:calico-system,Attempt:0,}"
May 13 00:25:44.867088 kubelet[2476]: E0513 00:25:44.867017 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:44.867469 containerd[1455]: time="2025-05-13T00:25:44.867419511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-t7b9x,Uid:44594c6e-aa9f-4fbd-9913-db1bedcc1034,Namespace:kube-system,Attempt:0,}"
May 13 00:25:44.872027 containerd[1455]: time="2025-05-13T00:25:44.871998970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb7cb75b-6pgjr,Uid:48db1284-6ae2-4942-9a9b-c26e0e5a874e,Namespace:calico-apiserver,Attempt:0,}"
May 13 00:25:44.876611 containerd[1455]: time="2025-05-13T00:25:44.876535668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb7cb75b-h6m65,Uid:3654df1f-4f7d-4d28-8abd-dfe02b50107c,Namespace:calico-apiserver,Attempt:0,}"
May 13 00:25:45.093112 containerd[1455]: time="2025-05-13T00:25:45.092971243Z" level=info msg="shim disconnected" id=8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701 namespace=k8s.io
May 13 00:25:45.093112 containerd[1455]: time="2025-05-13T00:25:45.093021498Z" level=warning msg="cleaning up after shim disconnected" id=8d273ba97f9cfe7e218ff251dbd9de6249b9db6e315c50509fdbc126618c9701 namespace=k8s.io
May 13 00:25:45.093112 containerd[1455]: time="2025-05-13T00:25:45.093029874Z" level=info msg="cleaning up dead shim" namespace=k8s.io
May 13 00:25:45.121205 containerd[1455]: time="2025-05-13T00:25:45.121162282Z" level=warning msg="cleanup warnings time=\"2025-05-13T00:25:45Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
May 13 00:25:45.260401 containerd[1455]: time="2025-05-13T00:25:45.260315430Z" level=error msg="Failed to destroy network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.266209 containerd[1455]: time="2025-05-13T00:25:45.266105230Z" level=error msg="Failed to destroy network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.268559 containerd[1455]: time="2025-05-13T00:25:45.268346782Z" level=error msg="encountered an error cleaning up failed sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.268559 containerd[1455]: time="2025-05-13T00:25:45.268394201Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-t7b9x,Uid:44594c6e-aa9f-4fbd-9913-db1bedcc1034,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.270331 containerd[1455]: time="2025-05-13T00:25:45.270301072Z" level=error msg="Failed to destroy network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.270715 containerd[1455]: time="2025-05-13T00:25:45.270666551Z" level=error msg="encountered an error cleaning up failed sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.270715 containerd[1455]: time="2025-05-13T00:25:45.270705324Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xgblg,Uid:93974201-6dc9-4f10-8077-3a1e2417dccb,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.271289 containerd[1455]: time="2025-05-13T00:25:45.271267024Z" level=error msg="Failed to destroy network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.271637 containerd[1455]: time="2025-05-13T00:25:45.271615764Z" level=error msg="encountered an error cleaning up failed sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.271720 containerd[1455]: time="2025-05-13T00:25:45.271701205Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tclg9,Uid:7de40860-c4d9-46ea-8cfc-0e1375b1cb33,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.283160 containerd[1455]: time="2025-05-13T00:25:45.283092980Z" level=error msg="Failed to destroy network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.283674 containerd[1455]: time="2025-05-13T00:25:45.283633630Z" level=error msg="encountered an error cleaning up failed sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.283730 containerd[1455]: time="2025-05-13T00:25:45.283710275Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79ff8699d-dbkcf,Uid:17288722-b2a9-43e7-9881-20dbd0f1ae77,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.284028 containerd[1455]: time="2025-05-13T00:25:45.283967831Z" level=error msg="encountered an error cleaning up failed sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.284028 containerd[1455]: time="2025-05-13T00:25:45.284015722Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb7cb75b-6pgjr,Uid:48db1284-6ae2-4942-9a9b-c26e0e5a874e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.295410 kubelet[2476]: E0513 00:25:45.295363 2476 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.295565 kubelet[2476]: E0513 00:25:45.295433 2476 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xgblg"
May 13 00:25:45.295565 kubelet[2476]: E0513 00:25:45.295453 2476 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xgblg"
May 13 00:25:45.295565 kubelet[2476]: E0513 00:25:45.295356 2476 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.295565 kubelet[2476]: E0513 00:25:45.295356 2476 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.295679 kubelet[2476]: E0513 00:25:45.295525 2476 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-t7b9x"
May 13 00:25:45.295679 kubelet[2476]: E0513 00:25:45.295563 2476 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-t7b9x"
May 13 00:25:45.295679 kubelet[2476]: E0513 00:25:45.295601 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-t7b9x_kube-system(44594c6e-aa9f-4fbd-9913-db1bedcc1034)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-t7b9x_kube-system(44594c6e-aa9f-4fbd-9913-db1bedcc1034)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-t7b9x" podUID="44594c6e-aa9f-4fbd-9913-db1bedcc1034"
May 13 00:25:45.295782 kubelet[2476]: E0513 00:25:45.295384 2476 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.295782 kubelet[2476]: E0513 00:25:45.295491 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-xgblg_calico-system(93974201-6dc9-4f10-8077-3a1e2417dccb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-xgblg_calico-system(93974201-6dc9-4f10-8077-3a1e2417dccb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb"
May 13 00:25:45.295782 kubelet[2476]: E0513 00:25:45.295631 2476 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79ff8699d-dbkcf"
May 13 00:25:45.296495 kubelet[2476]: E0513 00:25:45.295492 2476 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bb7cb75b-6pgjr"
May 13 00:25:45.296495 kubelet[2476]: E0513 00:25:45.295645 2476 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79ff8699d-dbkcf"
May 13 00:25:45.296495 kubelet[2476]: E0513 00:25:45.295649 2476 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bb7cb75b-6pgjr"
May 13 00:25:45.296609 kubelet[2476]: E0513 00:25:45.295669 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-79ff8699d-dbkcf_calico-system(17288722-b2a9-43e7-9881-20dbd0f1ae77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-79ff8699d-dbkcf_calico-system(17288722-b2a9-43e7-9881-20dbd0f1ae77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-79ff8699d-dbkcf" podUID="17288722-b2a9-43e7-9881-20dbd0f1ae77"
May 13 00:25:45.296609 kubelet[2476]: E0513 00:25:45.295677 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9bb7cb75b-6pgjr_calico-apiserver(48db1284-6ae2-4942-9a9b-c26e0e5a874e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9bb7cb75b-6pgjr_calico-apiserver(48db1284-6ae2-4942-9a9b-c26e0e5a874e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9bb7cb75b-6pgjr" podUID="48db1284-6ae2-4942-9a9b-c26e0e5a874e"
May 13 00:25:45.296609 kubelet[2476]: E0513 00:25:45.295956 2476 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 13 00:25:45.296770 kubelet[2476]: E0513 00:25:45.296090 2476 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-tclg9"
May 13 00:25:45.296770 kubelet[2476]: E0513 00:25:45.296118 2476 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-tclg9"
May 13 00:25:45.296770 kubelet[2476]: E0513 00:25:45.296189 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-tclg9_kube-system(7de40860-c4d9-46ea-8cfc-0e1375b1cb33)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-tclg9_kube-system(7de40860-c4d9-46ea-8cfc-0e1375b1cb33)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\\\": plugin type=\\\"calico\\\" failed (add): stat
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-tclg9" podUID="7de40860-c4d9-46ea-8cfc-0e1375b1cb33" May 13 00:25:45.311155 containerd[1455]: time="2025-05-13T00:25:45.311101175Z" level=error msg="Failed to destroy network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.311527 containerd[1455]: time="2025-05-13T00:25:45.311488446Z" level=error msg="encountered an error cleaning up failed sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.311624 containerd[1455]: time="2025-05-13T00:25:45.311557095Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb7cb75b-h6m65,Uid:3654df1f-4f7d-4d28-8abd-dfe02b50107c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.311761 kubelet[2476]: E0513 00:25:45.311717 2476 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.311821 kubelet[2476]: E0513 00:25:45.311766 2476 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bb7cb75b-h6m65" May 13 00:25:45.311821 kubelet[2476]: E0513 00:25:45.311784 2476 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bb7cb75b-h6m65" May 13 00:25:45.311901 kubelet[2476]: E0513 00:25:45.311818 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9bb7cb75b-h6m65_calico-apiserver(3654df1f-4f7d-4d28-8abd-dfe02b50107c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9bb7cb75b-h6m65_calico-apiserver(3654df1f-4f7d-4d28-8abd-dfe02b50107c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9bb7cb75b-h6m65" podUID="3654df1f-4f7d-4d28-8abd-dfe02b50107c" May 13 00:25:45.837416 kubelet[2476]: I0513 00:25:45.837379 2476 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" May 13 00:25:45.839869 kubelet[2476]: E0513 00:25:45.839728 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:25:45.840457 kubelet[2476]: I0513 00:25:45.840415 2476 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:25:45.840863 containerd[1455]: time="2025-05-13T00:25:45.840830503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 00:25:45.841402 containerd[1455]: time="2025-05-13T00:25:45.841361154Z" level=info msg="StopPodSandbox for \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\"" May 13 00:25:45.841525 containerd[1455]: time="2025-05-13T00:25:45.841505767Z" level=info msg="Ensure that sandbox fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a in task-service has been cleanup successfully" May 13 00:25:45.842891 containerd[1455]: time="2025-05-13T00:25:45.841678704Z" level=info msg="StopPodSandbox for \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\"" May 13 00:25:45.842891 containerd[1455]: time="2025-05-13T00:25:45.841833396Z" level=info msg="Ensure that sandbox 1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78 in task-service has been cleanup successfully" May 13 00:25:45.846245 kubelet[2476]: I0513 00:25:45.845216 2476 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:25:45.848247 kubelet[2476]: I0513 00:25:45.848188 2476 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" 
May 13 00:25:45.848942 containerd[1455]: time="2025-05-13T00:25:45.848688998Z" level=info msg="StopPodSandbox for \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\"" May 13 00:25:45.848942 containerd[1455]: time="2025-05-13T00:25:45.848862325Z" level=info msg="Ensure that sandbox e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890 in task-service has been cleanup successfully" May 13 00:25:45.849873 containerd[1455]: time="2025-05-13T00:25:45.849484039Z" level=info msg="StopPodSandbox for \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\"" May 13 00:25:45.849873 containerd[1455]: time="2025-05-13T00:25:45.849639072Z" level=info msg="Ensure that sandbox e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f in task-service has been cleanup successfully" May 13 00:25:45.850856 kubelet[2476]: I0513 00:25:45.850705 2476 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:25:45.855524 containerd[1455]: time="2025-05-13T00:25:45.855481101Z" level=info msg="StopPodSandbox for \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\"" May 13 00:25:45.855811 containerd[1455]: time="2025-05-13T00:25:45.855665208Z" level=info msg="Ensure that sandbox 4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5 in task-service has been cleanup successfully" May 13 00:25:45.860795 kubelet[2476]: I0513 00:25:45.860339 2476 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:25:45.860882 containerd[1455]: time="2025-05-13T00:25:45.860850427Z" level=info msg="StopPodSandbox for \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\"" May 13 00:25:45.861602 containerd[1455]: time="2025-05-13T00:25:45.861433508Z" level=info msg="Ensure that sandbox 
510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25 in task-service has been cleanup successfully" May 13 00:25:45.895349 containerd[1455]: time="2025-05-13T00:25:45.895286767Z" level=error msg="StopPodSandbox for \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\" failed" error="failed to destroy network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.895828 kubelet[2476]: E0513 00:25:45.895624 2476 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:25:45.895828 kubelet[2476]: E0513 00:25:45.895712 2476 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a"} May 13 00:25:45.895828 kubelet[2476]: E0513 00:25:45.895773 2476 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"44594c6e-aa9f-4fbd-9913-db1bedcc1034\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" May 13 00:25:45.895828 kubelet[2476]: E0513 00:25:45.895800 2476 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"44594c6e-aa9f-4fbd-9913-db1bedcc1034\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-t7b9x" podUID="44594c6e-aa9f-4fbd-9913-db1bedcc1034" May 13 00:25:45.896233 containerd[1455]: time="2025-05-13T00:25:45.896198558Z" level=error msg="StopPodSandbox for \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\" failed" error="failed to destroy network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.896407 kubelet[2476]: E0513 00:25:45.896389 2476 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" May 13 00:25:45.896488 kubelet[2476]: E0513 00:25:45.896476 2476 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78"} May 13 00:25:45.896590 kubelet[2476]: E0513 00:25:45.896552 2476 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"17288722-b2a9-43e7-9881-20dbd0f1ae77\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" May 13 00:25:45.896590 kubelet[2476]: E0513 00:25:45.896571 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"17288722-b2a9-43e7-9881-20dbd0f1ae77\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-79ff8699d-dbkcf" podUID="17288722-b2a9-43e7-9881-20dbd0f1ae77" May 13 00:25:45.903715 containerd[1455]: time="2025-05-13T00:25:45.903686053Z" level=error msg="StopPodSandbox for \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\" failed" error="failed to destroy network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.903930 kubelet[2476]: E0513 00:25:45.903908 2476 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:25:45.904562 kubelet[2476]: E0513 00:25:45.904495 2476 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890"} May 13 00:25:45.904607 kubelet[2476]: E0513 00:25:45.904574 2476 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3654df1f-4f7d-4d28-8abd-dfe02b50107c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" May 13 00:25:45.904663 kubelet[2476]: E0513 00:25:45.904601 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3654df1f-4f7d-4d28-8abd-dfe02b50107c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9bb7cb75b-h6m65" podUID="3654df1f-4f7d-4d28-8abd-dfe02b50107c" May 13 00:25:45.906814 containerd[1455]: time="2025-05-13T00:25:45.906772019Z" level=error msg="StopPodSandbox for \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\" failed" error="failed to destroy network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 13 00:25:45.906979 kubelet[2476]: E0513 00:25:45.906952 2476 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:25:45.907016 kubelet[2476]: E0513 00:25:45.906981 2476 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5"} May 13 00:25:45.907016 kubelet[2476]: E0513 00:25:45.907002 2476 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7de40860-c4d9-46ea-8cfc-0e1375b1cb33\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" May 13 00:25:45.907087 kubelet[2476]: E0513 00:25:45.907018 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7de40860-c4d9-46ea-8cfc-0e1375b1cb33\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-tclg9" podUID="7de40860-c4d9-46ea-8cfc-0e1375b1cb33" May 13 00:25:45.907699 
containerd[1455]: time="2025-05-13T00:25:45.907662710Z" level=error msg="StopPodSandbox for \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\" failed" error="failed to destroy network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.907861 kubelet[2476]: E0513 00:25:45.907809 2476 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:25:45.907895 kubelet[2476]: E0513 00:25:45.907858 2476 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25"} May 13 00:25:45.907895 kubelet[2476]: E0513 00:25:45.907878 2476 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"93974201-6dc9-4f10-8077-3a1e2417dccb\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" May 13 00:25:45.907970 kubelet[2476]: E0513 00:25:45.907926 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"93974201-6dc9-4f10-8077-3a1e2417dccb\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xgblg" podUID="93974201-6dc9-4f10-8077-3a1e2417dccb" May 13 00:25:45.910637 containerd[1455]: time="2025-05-13T00:25:45.910604594Z" level=error msg="StopPodSandbox for \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\" failed" error="failed to destroy network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 00:25:45.910758 kubelet[2476]: E0513 00:25:45.910732 2476 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:25:45.910810 kubelet[2476]: E0513 00:25:45.910762 2476 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f"} May 13 00:25:45.910833 kubelet[2476]: E0513 00:25:45.910810 2476 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"48db1284-6ae2-4942-9a9b-c26e0e5a874e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" May 13 00:25:45.910878 kubelet[2476]: E0513 00:25:45.910834 2476 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"48db1284-6ae2-4942-9a9b-c26e0e5a874e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9bb7cb75b-6pgjr" podUID="48db1284-6ae2-4942-9a9b-c26e0e5a874e" May 13 00:25:49.026865 systemd[1]: Started sshd@8-10.0.0.89:22-10.0.0.1:54186.service - OpenSSH per-connection server daemon (10.0.0.1:54186). May 13 00:25:49.059577 sshd[3807]: Accepted publickey for core from 10.0.0.1 port 54186 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:25:49.061443 sshd[3807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:25:49.067160 systemd-logind[1436]: New session 9 of user core. May 13 00:25:49.073708 systemd[1]: Started session-9.scope - Session 9 of User core. May 13 00:25:49.192459 sshd[3807]: pam_unix(sshd:session): session closed for user core May 13 00:25:49.196713 systemd[1]: sshd@8-10.0.0.89:22-10.0.0.1:54186.service: Deactivated successfully. May 13 00:25:49.198827 systemd[1]: session-9.scope: Deactivated successfully. May 13 00:25:49.199483 systemd-logind[1436]: Session 9 logged out. Waiting for processes to exit. May 13 00:25:49.200634 systemd-logind[1436]: Removed session 9. 
May 13 00:25:52.756418 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount230592732.mount: Deactivated successfully. May 13 00:25:54.205339 systemd[1]: Started sshd@9-10.0.0.89:22-10.0.0.1:54188.service - OpenSSH per-connection server daemon (10.0.0.1:54188). May 13 00:25:54.620367 containerd[1455]: time="2025-05-13T00:25:54.620296798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:54.621701 containerd[1455]: time="2025-05-13T00:25:54.621630528Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 13 00:25:54.623487 containerd[1455]: time="2025-05-13T00:25:54.623449674Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:54.625949 containerd[1455]: time="2025-05-13T00:25:54.625897210Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:25:54.626835 containerd[1455]: time="2025-05-13T00:25:54.626765946Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 8.785860432s" May 13 00:25:54.626835 containerd[1455]: time="2025-05-13T00:25:54.626831831Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 13 00:25:54.636626 sshd[3826]: Accepted publickey for core from 10.0.0.1 port 54188 ssh2: 
RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:25:54.638450 containerd[1455]: time="2025-05-13T00:25:54.637632152Z" level=info msg="CreateContainer within sandbox \"7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 00:25:54.639298 sshd[3826]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:25:54.646203 systemd-logind[1436]: New session 10 of user core. May 13 00:25:54.654709 systemd[1]: Started session-10.scope - Session 10 of User core. May 13 00:25:54.658344 containerd[1455]: time="2025-05-13T00:25:54.658300760Z" level=info msg="CreateContainer within sandbox \"7d3c2babe03c3654643def88a1435ab533bcc25f8658501e8e3ccab310e64d6f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"3c6e429f71affb0490b0c4cbd636bb67b1fdce40541a26b76cc04e1144dc45e0\"" May 13 00:25:54.659212 containerd[1455]: time="2025-05-13T00:25:54.659051032Z" level=info msg="StartContainer for \"3c6e429f71affb0490b0c4cbd636bb67b1fdce40541a26b76cc04e1144dc45e0\"" May 13 00:25:54.741696 systemd[1]: Started cri-containerd-3c6e429f71affb0490b0c4cbd636bb67b1fdce40541a26b76cc04e1144dc45e0.scope - libcontainer container 3c6e429f71affb0490b0c4cbd636bb67b1fdce40541a26b76cc04e1144dc45e0. May 13 00:25:54.774139 containerd[1455]: time="2025-05-13T00:25:54.774084071Z" level=info msg="StartContainer for \"3c6e429f71affb0490b0c4cbd636bb67b1fdce40541a26b76cc04e1144dc45e0\" returns successfully" May 13 00:25:54.779397 sshd[3826]: pam_unix(sshd:session): session closed for user core May 13 00:25:54.783965 systemd[1]: sshd@9-10.0.0.89:22-10.0.0.1:54188.service: Deactivated successfully. May 13 00:25:54.786845 systemd[1]: session-10.scope: Deactivated successfully. May 13 00:25:54.787562 systemd-logind[1436]: Session 10 logged out. Waiting for processes to exit. May 13 00:25:54.788412 systemd-logind[1436]: Removed session 10. 
May 13 00:25:54.838297 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
May 13 00:25:54.838395 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
May 13 00:25:54.879906 kubelet[2476]: E0513 00:25:54.879057 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:55.087426 kubelet[2476]: I0513 00:25:55.087362 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-lj45c" podStartSLOduration=1.302432093 podStartE2EDuration="24.087346048s" podCreationTimestamp="2025-05-13 00:25:31 +0000 UTC" firstStartedPulling="2025-05-13 00:25:31.842764308 +0000 UTC m=+13.358344155" lastFinishedPulling="2025-05-13 00:25:54.627678263 +0000 UTC m=+36.143258110" observedRunningTime="2025-05-13 00:25:55.086301712 +0000 UTC m=+36.601881579" watchObservedRunningTime="2025-05-13 00:25:55.087346048 +0000 UTC m=+36.602925895"
May 13 00:25:55.880464 kubelet[2476]: E0513 00:25:55.880409 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:56.563500 containerd[1455]: time="2025-05-13T00:25:56.563390090Z" level=info msg="StopPodSandbox for \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\""
May 13 00:25:56.565118 containerd[1455]: time="2025-05-13T00:25:56.564955997Z" level=info msg="StopPodSandbox for \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\""
May 13 00:25:56.746585 kernel: bpftool[4149]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.646 [INFO][4054] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.646 [INFO][4054] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" iface="eth0" netns="/var/run/netns/cni-7afa01e0-ebc5-cde8-aeb2-f1fe35e4229c"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.646 [INFO][4054] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" iface="eth0" netns="/var/run/netns/cni-7afa01e0-ebc5-cde8-aeb2-f1fe35e4229c"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.647 [INFO][4054] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" iface="eth0" netns="/var/run/netns/cni-7afa01e0-ebc5-cde8-aeb2-f1fe35e4229c"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.647 [INFO][4054] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.647 [INFO][4054] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.731 [INFO][4097] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.733 [INFO][4097] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.733 [INFO][4097] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.743 [WARNING][4097] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.743 [INFO][4097] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.744 [INFO][4097] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:25:56.751559 containerd[1455]: 2025-05-13 00:25:56.748 [INFO][4054] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5"
May 13 00:25:56.754412 systemd[1]: run-netns-cni\x2d7afa01e0\x2debc5\x2dcde8\x2daeb2\x2df1fe35e4229c.mount: Deactivated successfully.
May 13 00:25:56.755848 containerd[1455]: time="2025-05-13T00:25:56.754960913Z" level=info msg="TearDown network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\" successfully"
May 13 00:25:56.755848 containerd[1455]: time="2025-05-13T00:25:56.755026446Z" level=info msg="StopPodSandbox for \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\" returns successfully"
May 13 00:25:56.756091 kubelet[2476]: E0513 00:25:56.755927 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:56.758266 containerd[1455]: time="2025-05-13T00:25:56.756874333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tclg9,Uid:7de40860-c4d9-46ea-8cfc-0e1375b1cb33,Namespace:kube-system,Attempt:1,}"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.658 [INFO][4060] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.658 [INFO][4060] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" iface="eth0" netns="/var/run/netns/cni-90016564-d55d-6b05-194d-5cbca2ec86b7"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.659 [INFO][4060] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" iface="eth0" netns="/var/run/netns/cni-90016564-d55d-6b05-194d-5cbca2ec86b7"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.659 [INFO][4060] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" iface="eth0" netns="/var/run/netns/cni-90016564-d55d-6b05-194d-5cbca2ec86b7"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.659 [INFO][4060] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.659 [INFO][4060] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.732 [INFO][4103] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.732 [INFO][4103] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.744 [INFO][4103] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.752 [WARNING][4103] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.753 [INFO][4103] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.755 [INFO][4103] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:25:56.763606 containerd[1455]: 2025-05-13 00:25:56.760 [INFO][4060] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f"
May 13 00:25:56.766284 containerd[1455]: time="2025-05-13T00:25:56.763760402Z" level=info msg="TearDown network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\" successfully"
May 13 00:25:56.766284 containerd[1455]: time="2025-05-13T00:25:56.763787763Z" level=info msg="StopPodSandbox for \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\" returns successfully"
May 13 00:25:56.766284 containerd[1455]: time="2025-05-13T00:25:56.766037005Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb7cb75b-6pgjr,Uid:48db1284-6ae2-4942-9a9b-c26e0e5a874e,Namespace:calico-apiserver,Attempt:1,}"
May 13 00:25:56.766764 systemd[1]: run-netns-cni\x2d90016564\x2dd55d\x2d6b05\x2d194d\x2d5cbca2ec86b7.mount: Deactivated successfully.
May 13 00:25:57.168832 systemd-networkd[1385]: vxlan.calico: Link UP
May 13 00:25:57.168842 systemd-networkd[1385]: vxlan.calico: Gained carrier
May 13 00:25:57.332282 systemd-networkd[1385]: cali3bdf2d9dae4: Link UP
May 13 00:25:57.332881 systemd-networkd[1385]: cali3bdf2d9dae4: Gained carrier
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.233 [INFO][4198] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0 calico-apiserver-9bb7cb75b- calico-apiserver 48db1284-6ae2-4942-9a9b-c26e0e5a874e 843 0 2025-05-13 00:25:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9bb7cb75b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-9bb7cb75b-6pgjr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3bdf2d9dae4 [] []}} ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-6pgjr" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.233 [INFO][4198] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-6pgjr" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.274 [INFO][4225] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" HandleID="k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.286 [INFO][4225] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" HandleID="k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000312fc0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-9bb7cb75b-6pgjr", "timestamp":"2025-05-13 00:25:57.274099579 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.286 [INFO][4225] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.286 [INFO][4225] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.286 [INFO][4225] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.288 [INFO][4225] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.294 [INFO][4225] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.300 [INFO][4225] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.302 [INFO][4225] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.306 [INFO][4225] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.306 [INFO][4225] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.310 [INFO][4225] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.314 [INFO][4225] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.323 [INFO][4225] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.323 [INFO][4225] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" host="localhost"
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.323 [INFO][4225] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:25:57.345027 containerd[1455]: 2025-05-13 00:25:57.323 [INFO][4225] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" HandleID="k8s-pod-network.1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:57.345592 containerd[1455]: 2025-05-13 00:25:57.329 [INFO][4198] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-6pgjr" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0", GenerateName:"calico-apiserver-9bb7cb75b-", Namespace:"calico-apiserver", SelfLink:"", UID:"48db1284-6ae2-4942-9a9b-c26e0e5a874e", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb7cb75b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-9bb7cb75b-6pgjr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3bdf2d9dae4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:25:57.345592 containerd[1455]: 2025-05-13 00:25:57.329 [INFO][4198] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-6pgjr" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:57.345592 containerd[1455]: 2025-05-13 00:25:57.329 [INFO][4198] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3bdf2d9dae4 ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-6pgjr" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:57.345592 containerd[1455]: 2025-05-13 00:25:57.333 [INFO][4198] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-6pgjr" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:57.345592 containerd[1455]: 2025-05-13 00:25:57.333 [INFO][4198] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-6pgjr" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0", GenerateName:"calico-apiserver-9bb7cb75b-", Namespace:"calico-apiserver", SelfLink:"", UID:"48db1284-6ae2-4942-9a9b-c26e0e5a874e", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb7cb75b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f", Pod:"calico-apiserver-9bb7cb75b-6pgjr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3bdf2d9dae4", MAC:"0a:25:80:41:a3:b4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:25:57.345592 containerd[1455]: 2025-05-13 00:25:57.340 [INFO][4198] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-6pgjr" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0"
May 13 00:25:57.374932 containerd[1455]: time="2025-05-13T00:25:57.374744452Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
May 13 00:25:57.374932 containerd[1455]: time="2025-05-13T00:25:57.374797041Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
May 13 00:25:57.374932 containerd[1455]: time="2025-05-13T00:25:57.374817921Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:57.374932 containerd[1455]: time="2025-05-13T00:25:57.374900927Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:57.392655 systemd[1]: Started cri-containerd-1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f.scope - libcontainer container 1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f.
May 13 00:25:57.409015 systemd-resolved[1327]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 00:25:57.419781 systemd-networkd[1385]: cali27a5de7679e: Link UP
May 13 00:25:57.419966 systemd-networkd[1385]: cali27a5de7679e: Gained carrier
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.233 [INFO][4191] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--tclg9-eth0 coredns-668d6bf9bc- kube-system 7de40860-c4d9-46ea-8cfc-0e1375b1cb33 842 0 2025-05-13 00:25:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-tclg9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali27a5de7679e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Namespace="kube-system" Pod="coredns-668d6bf9bc-tclg9" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tclg9-"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.233 [INFO][4191] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Namespace="kube-system" Pod="coredns-668d6bf9bc-tclg9" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.276 [INFO][4218] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" HandleID="k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.293 [INFO][4218] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" HandleID="k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003aa890), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-tclg9", "timestamp":"2025-05-13 00:25:57.276256698 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.293 [INFO][4218] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.323 [INFO][4218] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.323 [INFO][4218] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.389 [INFO][4218] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.394 [INFO][4218] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.399 [INFO][4218] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.401 [INFO][4218] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.403 [INFO][4218] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.403 [INFO][4218] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.404 [INFO][4218] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.408 [INFO][4218] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.414 [INFO][4218] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.414 [INFO][4218] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" host="localhost"
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.414 [INFO][4218] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:25:57.436460 containerd[1455]: 2025-05-13 00:25:57.414 [INFO][4218] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" HandleID="k8s-pod-network.c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:57.437017 containerd[1455]: 2025-05-13 00:25:57.417 [INFO][4191] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Namespace="kube-system" Pod="coredns-668d6bf9bc-tclg9" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--tclg9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7de40860-c4d9-46ea-8cfc-0e1375b1cb33", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-tclg9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali27a5de7679e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:25:57.437017 containerd[1455]: 2025-05-13 00:25:57.417 [INFO][4191] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Namespace="kube-system" Pod="coredns-668d6bf9bc-tclg9" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:57.437017 containerd[1455]: 2025-05-13 00:25:57.417 [INFO][4191] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali27a5de7679e ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Namespace="kube-system" Pod="coredns-668d6bf9bc-tclg9" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:57.437017 containerd[1455]: 2025-05-13 00:25:57.419 [INFO][4191] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Namespace="kube-system" Pod="coredns-668d6bf9bc-tclg9" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:57.437017 containerd[1455]: 2025-05-13 00:25:57.419 [INFO][4191] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Namespace="kube-system" Pod="coredns-668d6bf9bc-tclg9" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--tclg9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7de40860-c4d9-46ea-8cfc-0e1375b1cb33", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111", Pod:"coredns-668d6bf9bc-tclg9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali27a5de7679e", MAC:"46:73:1f:d6:36:f2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:25:57.437017 containerd[1455]: 2025-05-13 00:25:57.430 [INFO][4191] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111" Namespace="kube-system" Pod="coredns-668d6bf9bc-tclg9" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0"
May 13 00:25:57.443316 containerd[1455]: time="2025-05-13T00:25:57.443287914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb7cb75b-6pgjr,Uid:48db1284-6ae2-4942-9a9b-c26e0e5a874e,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f\""
May 13 00:25:57.445960 containerd[1455]: time="2025-05-13T00:25:57.445925545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\""
May 13 00:25:57.467421 containerd[1455]: time="2025-05-13T00:25:57.467334552Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
May 13 00:25:57.467421 containerd[1455]: time="2025-05-13T00:25:57.467383976Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
May 13 00:25:57.467421 containerd[1455]: time="2025-05-13T00:25:57.467409023Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:57.467700 containerd[1455]: time="2025-05-13T00:25:57.467599070Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:57.491699 systemd[1]: Started cri-containerd-c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111.scope - libcontainer container c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111.
May 13 00:25:57.504879 systemd-resolved[1327]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 00:25:57.530444 containerd[1455]: time="2025-05-13T00:25:57.530396751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tclg9,Uid:7de40860-c4d9-46ea-8cfc-0e1375b1cb33,Namespace:kube-system,Attempt:1,} returns sandbox id \"c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111\""
May 13 00:25:57.531212 kubelet[2476]: E0513 00:25:57.531183 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:57.533223 containerd[1455]: time="2025-05-13T00:25:57.533178946Z" level=info msg="CreateContainer within sandbox \"c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 13 00:25:57.548330 containerd[1455]: time="2025-05-13T00:25:57.548289002Z" level=info msg="CreateContainer within sandbox \"c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9ce9f14892dae4795905d95003df2f7d8cce6267a00afc7ab946511ad13281ed\""
May 13 00:25:57.548708 containerd[1455]: time="2025-05-13T00:25:57.548680197Z" level=info msg="StartContainer for \"9ce9f14892dae4795905d95003df2f7d8cce6267a00afc7ab946511ad13281ed\""
May 13 00:25:57.560927 containerd[1455]: time="2025-05-13T00:25:57.560826218Z" level=info msg="StopPodSandbox for \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\""
May 13 00:25:57.572678 systemd[1]: Started cri-containerd-9ce9f14892dae4795905d95003df2f7d8cce6267a00afc7ab946511ad13281ed.scope - libcontainer container 9ce9f14892dae4795905d95003df2f7d8cce6267a00afc7ab946511ad13281ed.
May 13 00:25:57.603102 containerd[1455]: time="2025-05-13T00:25:57.603061656Z" level=info msg="StartContainer for \"9ce9f14892dae4795905d95003df2f7d8cce6267a00afc7ab946511ad13281ed\" returns successfully"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.606 [INFO][4409] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.606 [INFO][4409] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" iface="eth0" netns="/var/run/netns/cni-15b0290e-c432-e2a0-899e-0f9b289a41ac"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.606 [INFO][4409] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" iface="eth0" netns="/var/run/netns/cni-15b0290e-c432-e2a0-899e-0f9b289a41ac"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.606 [INFO][4409] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" iface="eth0" netns="/var/run/netns/cni-15b0290e-c432-e2a0-899e-0f9b289a41ac"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.606 [INFO][4409] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.606 [INFO][4409] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.631 [INFO][4430] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.631 [INFO][4430] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.631 [INFO][4430] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.636 [WARNING][4430] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.636 [INFO][4430] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.637 [INFO][4430] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:25:57.644053 containerd[1455]: 2025-05-13 00:25:57.641 [INFO][4409] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78"
May 13 00:25:57.644516 containerd[1455]: time="2025-05-13T00:25:57.644207243Z" level=info msg="TearDown network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\" successfully"
May 13 00:25:57.644516 containerd[1455]: time="2025-05-13T00:25:57.644233572Z" level=info msg="StopPodSandbox for \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\" returns successfully"
May 13 00:25:57.644990 containerd[1455]: time="2025-05-13T00:25:57.644946263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79ff8699d-dbkcf,Uid:17288722-b2a9-43e7-9881-20dbd0f1ae77,Namespace:calico-system,Attempt:1,}"
May 13 00:25:57.758833 systemd[1]: run-netns-cni\x2d15b0290e\x2dc432\x2de2a0\x2d899e\x2d0f9b289a41ac.mount: Deactivated successfully.
May 13 00:25:57.777822 systemd-networkd[1385]: cali0948978a615: Link UP
May 13 00:25:57.779103 systemd-networkd[1385]: cali0948978a615: Gained carrier
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.717 [INFO][4444] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0 calico-kube-controllers-79ff8699d- calico-system 17288722-b2a9-43e7-9881-20dbd0f1ae77 863 0 2025-05-13 00:25:31 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:79ff8699d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-79ff8699d-dbkcf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0948978a615 [] []}} ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Namespace="calico-system" Pod="calico-kube-controllers-79ff8699d-dbkcf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.717 [INFO][4444] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Namespace="calico-system" Pod="calico-kube-controllers-79ff8699d-dbkcf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.745 [INFO][4461] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" HandleID="k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.752 [INFO][4461] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" HandleID="k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b3960), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-79ff8699d-dbkcf", "timestamp":"2025-05-13 00:25:57.74510971 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.752 [INFO][4461] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.752 [INFO][4461] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.752 [INFO][4461] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.754 [INFO][4461] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.757 [INFO][4461] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.760 [INFO][4461] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.761 [INFO][4461] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.763 [INFO][4461] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.763 [INFO][4461] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.764 [INFO][4461] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.767 [INFO][4461] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.772 [INFO][4461] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.772 [INFO][4461] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" host="localhost"
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.772 [INFO][4461] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:25:57.788570 containerd[1455]: 2025-05-13 00:25:57.772 [INFO][4461] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" HandleID="k8s-pod-network.9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.789075 containerd[1455]: 2025-05-13 00:25:57.775 [INFO][4444] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Namespace="calico-system" Pod="calico-kube-controllers-79ff8699d-dbkcf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0", GenerateName:"calico-kube-controllers-79ff8699d-", Namespace:"calico-system", SelfLink:"", UID:"17288722-b2a9-43e7-9881-20dbd0f1ae77", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79ff8699d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-79ff8699d-dbkcf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0948978a615", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:25:57.789075 containerd[1455]: 2025-05-13 00:25:57.775 [INFO][4444] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Namespace="calico-system" Pod="calico-kube-controllers-79ff8699d-dbkcf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.789075 containerd[1455]: 2025-05-13 00:25:57.775 [INFO][4444] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0948978a615 ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Namespace="calico-system" Pod="calico-kube-controllers-79ff8699d-dbkcf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.789075 containerd[1455]: 2025-05-13 00:25:57.777 [INFO][4444] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Namespace="calico-system" Pod="calico-kube-controllers-79ff8699d-dbkcf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.789075 containerd[1455]: 2025-05-13 00:25:57.778 [INFO][4444] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Namespace="calico-system" Pod="calico-kube-controllers-79ff8699d-dbkcf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0", GenerateName:"calico-kube-controllers-79ff8699d-", Namespace:"calico-system", SelfLink:"", UID:"17288722-b2a9-43e7-9881-20dbd0f1ae77", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79ff8699d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0", Pod:"calico-kube-controllers-79ff8699d-dbkcf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0948978a615", MAC:"46:af:35:19:e4:77", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:25:57.789075 containerd[1455]: 2025-05-13 00:25:57.785 [INFO][4444] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0" Namespace="calico-system" Pod="calico-kube-controllers-79ff8699d-dbkcf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:25:57.810090 containerd[1455]: time="2025-05-13T00:25:57.809968669Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
May 13 00:25:57.810090 containerd[1455]: time="2025-05-13T00:25:57.810047428Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
May 13 00:25:57.810090 containerd[1455]: time="2025-05-13T00:25:57.810061634Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:57.810444 containerd[1455]: time="2025-05-13T00:25:57.810170709Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:25:57.831858 systemd[1]: Started cri-containerd-9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0.scope - libcontainer container 9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0.
May 13 00:25:57.844045 systemd-resolved[1327]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 00:25:57.869221 containerd[1455]: time="2025-05-13T00:25:57.869169143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79ff8699d-dbkcf,Uid:17288722-b2a9-43e7-9881-20dbd0f1ae77,Namespace:calico-system,Attempt:1,} returns sandbox id \"9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0\""
May 13 00:25:57.887731 kubelet[2476]: E0513 00:25:57.886875 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:57.955680 kubelet[2476]: I0513 00:25:57.955617 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-tclg9" podStartSLOduration=33.955600085 podStartE2EDuration="33.955600085s" podCreationTimestamp="2025-05-13 00:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 00:25:57.953618157 +0000 UTC m=+39.469198004" watchObservedRunningTime="2025-05-13 00:25:57.955600085 +0000 UTC m=+39.471179932"
May 13 00:25:58.633707 systemd-networkd[1385]: cali27a5de7679e: Gained IPv6LL
May 13 00:25:58.761797 systemd-networkd[1385]: vxlan.calico: Gained IPv6LL
May 13 00:25:58.891217 kubelet[2476]: E0513 00:25:58.891091 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:25:59.209720 systemd-networkd[1385]: cali0948978a615: Gained IPv6LL
May 13 00:25:59.273784 systemd-networkd[1385]: cali3bdf2d9dae4: Gained IPv6LL
May 13 00:25:59.562008 containerd[1455]: time="2025-05-13T00:25:59.561902862Z" level=info msg="StopPodSandbox for \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\""
May 13 00:25:59.562666 containerd[1455]: time="2025-05-13T00:25:59.562076749Z" level=info msg="StopPodSandbox for \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\""
May 13 00:25:59.562666 containerd[1455]: time="2025-05-13T00:25:59.562136661Z" level=info msg="StopPodSandbox for \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\""
May 13 00:25:59.791672 systemd[1]: Started sshd@10-10.0.0.89:22-10.0.0.1:54098.service - OpenSSH per-connection server daemon (10.0.0.1:54098).
May 13 00:25:59.892861 kubelet[2476]: E0513 00:25:59.892741 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:26:00.005527 sshd[4603]: Accepted publickey for core from 10.0.0.1 port 54098 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.795 [INFO][4576] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.795 [INFO][4576] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" iface="eth0" netns="/var/run/netns/cni-0cff0f3c-83e5-e1b1-6659-76839e93b6ac"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.796 [INFO][4576] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" iface="eth0" netns="/var/run/netns/cni-0cff0f3c-83e5-e1b1-6659-76839e93b6ac"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.796 [INFO][4576] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" iface="eth0" netns="/var/run/netns/cni-0cff0f3c-83e5-e1b1-6659-76839e93b6ac"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.796 [INFO][4576] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.796 [INFO][4576] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.983 [INFO][4605] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.985 [INFO][4605] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.985 [INFO][4605] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.993 [WARNING][4605] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.994 [INFO][4605] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:25:59.997 [INFO][4605] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:26:00.006000 containerd[1455]: 2025-05-13 00:26:00.001 [INFO][4576] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25"
May 13 00:26:00.006740 containerd[1455]: time="2025-05-13T00:26:00.006662536Z" level=info msg="TearDown network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\" successfully"
May 13 00:26:00.006740 containerd[1455]: time="2025-05-13T00:26:00.006696680Z" level=info msg="StopPodSandbox for \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\" returns successfully"
May 13 00:26:00.007507 containerd[1455]: time="2025-05-13T00:26:00.007489289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xgblg,Uid:93974201-6dc9-4f10-8077-3a1e2417dccb,Namespace:calico-system,Attempt:1,}"
May 13 00:26:00.009473 sshd[4603]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:26:00.010138 systemd[1]: run-netns-cni\x2d0cff0f3c\x2d83e5\x2de1b1\x2d6659\x2d76839e93b6ac.mount: Deactivated successfully.
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.950 [INFO][4582] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.950 [INFO][4582] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" iface="eth0" netns="/var/run/netns/cni-89e4097f-859d-28f7-c878-4af7b701f9cc"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.950 [INFO][4582] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" iface="eth0" netns="/var/run/netns/cni-89e4097f-859d-28f7-c878-4af7b701f9cc"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.950 [INFO][4582] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" iface="eth0" netns="/var/run/netns/cni-89e4097f-859d-28f7-c878-4af7b701f9cc"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.950 [INFO][4582] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.950 [INFO][4582] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.997 [INFO][4608] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.997 [INFO][4608] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:25:59.997 [INFO][4608] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:26:00.002 [WARNING][4608] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:26:00.002 [INFO][4608] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:26:00.005 [INFO][4608] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:26:00.014086 containerd[1455]: 2025-05-13 00:26:00.010 [INFO][4582] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a"
May 13 00:26:00.017272 containerd[1455]: time="2025-05-13T00:26:00.014615008Z" level=info msg="TearDown network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\" successfully"
May 13 00:26:00.017272 containerd[1455]: time="2025-05-13T00:26:00.014656937Z" level=info msg="StopPodSandbox for \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\" returns successfully"
May 13 00:26:00.016750 systemd[1]: run-netns-cni\x2d89e4097f\x2d859d\x2d28f7\x2dc878\x2d4af7b701f9cc.mount: Deactivated successfully.
May 13 00:26:00.017440 kubelet[2476]: E0513 00:26:00.015403 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:26:00.022509 systemd-logind[1436]: New session 11 of user core.
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:25:59.960 [INFO][4580] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:25:59.960 [INFO][4580] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" iface="eth0" netns="/var/run/netns/cni-94195005-d032-833a-35c3-cbd55426d8e9"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:25:59.960 [INFO][4580] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" iface="eth0" netns="/var/run/netns/cni-94195005-d032-833a-35c3-cbd55426d8e9"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:25:59.960 [INFO][4580] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" iface="eth0" netns="/var/run/netns/cni-94195005-d032-833a-35c3-cbd55426d8e9"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:25:59.960 [INFO][4580] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:25:59.960 [INFO][4580] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:26:00.000 [INFO][4622] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:26:00.000 [INFO][4622] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:26:00.006 [INFO][4622] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:26:00.013 [WARNING][4622] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:26:00.013 [INFO][4622] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:26:00.015 [INFO][4622] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:26:00.033978 containerd[1455]: 2025-05-13 00:26:00.019 [INFO][4580] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890"
May 13 00:26:00.034917 containerd[1455]: time="2025-05-13T00:26:00.034121226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-t7b9x,Uid:44594c6e-aa9f-4fbd-9913-db1bedcc1034,Namespace:kube-system,Attempt:1,}"
May 13 00:26:00.038211 systemd[1]: Started session-11.scope - Session 11 of User core.
May 13 00:26:00.042077 systemd[1]: run-netns-cni\x2d94195005\x2dd032\x2d833a\x2d35c3\x2dcbd55426d8e9.mount: Deactivated successfully.
May 13 00:26:00.045654 containerd[1455]: time="2025-05-13T00:26:00.045628874Z" level=info msg="TearDown network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\" successfully" May 13 00:26:00.045738 containerd[1455]: time="2025-05-13T00:26:00.045724805Z" level=info msg="StopPodSandbox for \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\" returns successfully" May 13 00:26:00.048212 containerd[1455]: time="2025-05-13T00:26:00.048191573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb7cb75b-h6m65,Uid:3654df1f-4f7d-4d28-8abd-dfe02b50107c,Namespace:calico-apiserver,Attempt:1,}" May 13 00:26:00.181075 sshd[4603]: pam_unix(sshd:session): session closed for user core May 13 00:26:00.190614 systemd[1]: sshd@10-10.0.0.89:22-10.0.0.1:54098.service: Deactivated successfully. May 13 00:26:00.193473 systemd[1]: session-11.scope: Deactivated successfully. May 13 00:26:00.194419 systemd-logind[1436]: Session 11 logged out. Waiting for processes to exit. May 13 00:26:00.196616 systemd-logind[1436]: Removed session 11. May 13 00:26:00.203506 systemd[1]: Started sshd@11-10.0.0.89:22-10.0.0.1:54104.service - OpenSSH per-connection server daemon (10.0.0.1:54104). May 13 00:26:00.233456 sshd[4695]: Accepted publickey for core from 10.0.0.1 port 54104 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:00.235248 sshd[4695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:00.239674 systemd-logind[1436]: New session 12 of user core. May 13 00:26:00.249782 systemd[1]: Started session-12.scope - Session 12 of User core. 
May 13 00:26:00.296196 systemd-networkd[1385]: caliab44238195e: Link UP
May 13 00:26:00.296413 systemd-networkd[1385]: caliab44238195e: Gained carrier
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.207 [INFO][4647] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--xgblg-eth0 csi-node-driver- calico-system 93974201-6dc9-4f10-8077-3a1e2417dccb 886 0 2025-05-13 00:25:31 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5b5cc68cd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-xgblg eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliab44238195e [] []}} ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Namespace="calico-system" Pod="csi-node-driver-xgblg" WorkloadEndpoint="localhost-k8s-csi--node--driver--xgblg-"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.207 [INFO][4647] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Namespace="calico-system" Pod="csi-node-driver-xgblg" WorkloadEndpoint="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.253 [INFO][4705] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" HandleID="k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Workload="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.263 [INFO][4705] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" HandleID="k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Workload="localhost-k8s-csi--node--driver--xgblg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003895f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-xgblg", "timestamp":"2025-05-13 00:26:00.253087933 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.264 [INFO][4705] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.264 [INFO][4705] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.264 [INFO][4705] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.267 [INFO][4705] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.271 [INFO][4705] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.276 [INFO][4705] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.278 [INFO][4705] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.280 [INFO][4705] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.280 [INFO][4705] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.281 [INFO][4705] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.284 [INFO][4705] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.290 [INFO][4705] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.290 [INFO][4705] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" host="localhost"
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.290 [INFO][4705] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:26:00.313042 containerd[1455]: 2025-05-13 00:26:00.290 [INFO][4705] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" HandleID="k8s-pod-network.80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Workload="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.313701 containerd[1455]: 2025-05-13 00:26:00.293 [INFO][4647] cni-plugin/k8s.go 386: Populated endpoint ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Namespace="calico-system" Pod="csi-node-driver-xgblg" WorkloadEndpoint="localhost-k8s-csi--node--driver--xgblg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--xgblg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"93974201-6dc9-4f10-8077-3a1e2417dccb", ResourceVersion:"886", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-xgblg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliab44238195e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:26:00.313701 containerd[1455]: 2025-05-13 00:26:00.293 [INFO][4647] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Namespace="calico-system" Pod="csi-node-driver-xgblg" WorkloadEndpoint="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.313701 containerd[1455]: 2025-05-13 00:26:00.293 [INFO][4647] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab44238195e ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Namespace="calico-system" Pod="csi-node-driver-xgblg" WorkloadEndpoint="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.313701 containerd[1455]: 2025-05-13 00:26:00.295 [INFO][4647] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Namespace="calico-system" Pod="csi-node-driver-xgblg" WorkloadEndpoint="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.313701 containerd[1455]: 2025-05-13 00:26:00.296 [INFO][4647] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Namespace="calico-system" Pod="csi-node-driver-xgblg" WorkloadEndpoint="localhost-k8s-csi--node--driver--xgblg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--xgblg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"93974201-6dc9-4f10-8077-3a1e2417dccb", ResourceVersion:"886", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0", Pod:"csi-node-driver-xgblg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliab44238195e", MAC:"a2:a3:90:4f:01:35", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:26:00.313701 containerd[1455]: 2025-05-13 00:26:00.309 [INFO][4647] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0" Namespace="calico-system" Pod="csi-node-driver-xgblg" WorkloadEndpoint="localhost-k8s-csi--node--driver--xgblg-eth0"
May 13 00:26:00.437855 systemd-networkd[1385]: califf2620a7d8d: Link UP
May 13 00:26:00.438531 systemd-networkd[1385]: califf2620a7d8d: Gained carrier
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.209 [INFO][4657] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0 coredns-668d6bf9bc- kube-system 44594c6e-aa9f-4fbd-9913-db1bedcc1034 887 0 2025-05-13 00:25:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-t7b9x eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califf2620a7d8d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Namespace="kube-system" Pod="coredns-668d6bf9bc-t7b9x" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--t7b9x-"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.209 [INFO][4657] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Namespace="kube-system" Pod="coredns-668d6bf9bc-t7b9x" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.252 [INFO][4699] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" HandleID="k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.264 [INFO][4699] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" HandleID="k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000375d80), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-t7b9x", "timestamp":"2025-05-13 00:26:00.25240582 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.264 [INFO][4699] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.291 [INFO][4699] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.291 [INFO][4699] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.368 [INFO][4699] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.373 [INFO][4699] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.376 [INFO][4699] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.377 [INFO][4699] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.379 [INFO][4699] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.380 [INFO][4699] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.381 [INFO][4699] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.390 [INFO][4699] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.420 [INFO][4699] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.420 [INFO][4699] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" host="localhost"
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.420 [INFO][4699] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:26:00.457359 containerd[1455]: 2025-05-13 00:26:00.420 [INFO][4699] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" HandleID="k8s-pod-network.54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.457993 containerd[1455]: 2025-05-13 00:26:00.430 [INFO][4657] cni-plugin/k8s.go 386: Populated endpoint ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Namespace="kube-system" Pod="coredns-668d6bf9bc-t7b9x" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"44594c6e-aa9f-4fbd-9913-db1bedcc1034", ResourceVersion:"887", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-t7b9x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califf2620a7d8d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:26:00.457993 containerd[1455]: 2025-05-13 00:26:00.430 [INFO][4657] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Namespace="kube-system" Pod="coredns-668d6bf9bc-t7b9x" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.457993 containerd[1455]: 2025-05-13 00:26:00.430 [INFO][4657] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califf2620a7d8d ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Namespace="kube-system" Pod="coredns-668d6bf9bc-t7b9x" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.457993 containerd[1455]: 2025-05-13 00:26:00.434 [INFO][4657] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Namespace="kube-system" Pod="coredns-668d6bf9bc-t7b9x" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.457993 containerd[1455]: 2025-05-13 00:26:00.435 [INFO][4657] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Namespace="kube-system" Pod="coredns-668d6bf9bc-t7b9x" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"44594c6e-aa9f-4fbd-9913-db1bedcc1034", ResourceVersion:"887", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d", Pod:"coredns-668d6bf9bc-t7b9x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califf2620a7d8d", MAC:"3a:50:5c:43:1b:a1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:26:00.457993 containerd[1455]: 2025-05-13 00:26:00.451 [INFO][4657] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d" Namespace="kube-system" Pod="coredns-668d6bf9bc-t7b9x" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0"
May 13 00:26:00.464999 containerd[1455]: time="2025-05-13T00:26:00.464930463Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
May 13 00:26:00.465234 containerd[1455]: time="2025-05-13T00:26:00.464979826Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
May 13 00:26:00.465234 containerd[1455]: time="2025-05-13T00:26:00.465013780Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:26:00.465234 containerd[1455]: time="2025-05-13T00:26:00.465125610Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:26:00.497749 systemd[1]: Started cri-containerd-80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0.scope - libcontainer container 80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0.
May 13 00:26:00.513716 systemd-resolved[1327]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 00:26:00.525662 containerd[1455]: time="2025-05-13T00:26:00.525625529Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xgblg,Uid:93974201-6dc9-4f10-8077-3a1e2417dccb,Namespace:calico-system,Attempt:1,} returns sandbox id \"80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0\""
May 13 00:26:00.612116 sshd[4695]: pam_unix(sshd:session): session closed for user core
May 13 00:26:00.617109 systemd-networkd[1385]: calidcedcd1e742: Link UP
May 13 00:26:00.617977 systemd-networkd[1385]: calidcedcd1e742: Gained carrier
May 13 00:26:00.626629 systemd[1]: sshd@11-10.0.0.89:22-10.0.0.1:54104.service: Deactivated successfully.
May 13 00:26:00.629343 systemd[1]: session-12.scope: Deactivated successfully.
May 13 00:26:00.634218 systemd-logind[1436]: Session 12 logged out. Waiting for processes to exit.
May 13 00:26:00.641268 containerd[1455]: time="2025-05-13T00:26:00.640832505Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
May 13 00:26:00.641268 containerd[1455]: time="2025-05-13T00:26:00.640896124Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
May 13 00:26:00.641268 containerd[1455]: time="2025-05-13T00:26:00.640917955Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:26:00.642655 containerd[1455]: time="2025-05-13T00:26:00.641011361Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.217 [INFO][4670] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0 calico-apiserver-9bb7cb75b- calico-apiserver 3654df1f-4f7d-4d28-8abd-dfe02b50107c 891 0 2025-05-13 00:25:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9bb7cb75b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-9bb7cb75b-h6m65 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidcedcd1e742 [] []}} ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-h6m65" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.217 [INFO][4670] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-h6m65" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.255 [INFO][4711] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" HandleID="k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.267 [INFO][4711] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" HandleID="k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e3d50), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-9bb7cb75b-h6m65", "timestamp":"2025-05-13 00:26:00.255272621 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.267 [INFO][4711] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.421 [INFO][4711] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.421 [INFO][4711] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.471 [INFO][4711] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.476 [INFO][4711] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.480 [INFO][4711] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.481 [INFO][4711] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.484 [INFO][4711] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.484 [INFO][4711] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.486 [INFO][4711] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.509 [INFO][4711] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.604 [INFO][4711] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.604 [INFO][4711] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" host="localhost"
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.604 [INFO][4711] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:26:00.646280 containerd[1455]: 2025-05-13 00:26:00.604 [INFO][4711] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" HandleID="k8s-pod-network.bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.646990 containerd[1455]: 2025-05-13 00:26:00.608 [INFO][4670] cni-plugin/k8s.go 386: Populated endpoint ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-h6m65" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0", GenerateName:"calico-apiserver-9bb7cb75b-", Namespace:"calico-apiserver", SelfLink:"", UID:"3654df1f-4f7d-4d28-8abd-dfe02b50107c", ResourceVersion:"891", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb7cb75b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-9bb7cb75b-h6m65", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidcedcd1e742", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 00:26:00.646990 containerd[1455]: 2025-05-13 00:26:00.608 [INFO][4670] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-h6m65" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.646990 containerd[1455]: 2025-05-13 00:26:00.608 [INFO][4670] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidcedcd1e742 ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-h6m65" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.646990 containerd[1455]: 2025-05-13 00:26:00.618 [INFO][4670] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-h6m65" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0"
May 13 00:26:00.646990 containerd[1455]: 2025-05-13 00:26:00.619 [INFO][4670] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-h6m65" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0", GenerateName:"calico-apiserver-9bb7cb75b-", Namespace:"calico-apiserver", SelfLink:"",
UID:"3654df1f-4f7d-4d28-8abd-dfe02b50107c", ResourceVersion:"891", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb7cb75b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509", Pod:"calico-apiserver-9bb7cb75b-h6m65", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidcedcd1e742", MAC:"e2:02:d1:c1:08:04", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:00.646990 containerd[1455]: 2025-05-13 00:26:00.641 [INFO][4670] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509" Namespace="calico-apiserver" Pod="calico-apiserver-9bb7cb75b-h6m65" WorkloadEndpoint="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" May 13 00:26:00.651234 systemd[1]: Started sshd@12-10.0.0.89:22-10.0.0.1:54114.service - OpenSSH per-connection server daemon (10.0.0.1:54114). May 13 00:26:00.652973 systemd-logind[1436]: Removed session 12. 
May 13 00:26:00.670058 systemd[1]: Started cri-containerd-54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d.scope - libcontainer container 54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d. May 13 00:26:00.687326 systemd-resolved[1327]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 00:26:00.688450 sshd[4824]: Accepted publickey for core from 10.0.0.1 port 54114 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:00.692013 sshd[4824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:00.696702 containerd[1455]: time="2025-05-13T00:26:00.693474829Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 May 13 00:26:00.696702 containerd[1455]: time="2025-05-13T00:26:00.693627255Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 May 13 00:26:00.696702 containerd[1455]: time="2025-05-13T00:26:00.693680055Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:26:00.696702 containerd[1455]: time="2025-05-13T00:26:00.694003333Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 May 13 00:26:00.699231 systemd-logind[1436]: New session 13 of user core. May 13 00:26:00.704828 systemd[1]: Started session-13.scope - Session 13 of User core. May 13 00:26:00.720735 systemd[1]: Started cri-containerd-bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509.scope - libcontainer container bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509. 
May 13 00:26:00.722343 containerd[1455]: time="2025-05-13T00:26:00.722308376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-t7b9x,Uid:44594c6e-aa9f-4fbd-9913-db1bedcc1034,Namespace:kube-system,Attempt:1,} returns sandbox id \"54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d\"" May 13 00:26:00.723437 kubelet[2476]: E0513 00:26:00.723263 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:26:00.727876 containerd[1455]: time="2025-05-13T00:26:00.727832101Z" level=info msg="CreateContainer within sandbox \"54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 00:26:00.733118 systemd-resolved[1327]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 00:26:00.762438 containerd[1455]: time="2025-05-13T00:26:00.762393898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb7cb75b-h6m65,Uid:3654df1f-4f7d-4d28-8abd-dfe02b50107c,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509\"" May 13 00:26:01.048434 sshd[4824]: pam_unix(sshd:session): session closed for user core May 13 00:26:01.053326 systemd[1]: sshd@12-10.0.0.89:22-10.0.0.1:54114.service: Deactivated successfully. May 13 00:26:01.061362 systemd[1]: session-13.scope: Deactivated successfully. May 13 00:26:01.062182 systemd-logind[1436]: Session 13 logged out. Waiting for processes to exit. May 13 00:26:01.063428 systemd-logind[1436]: Removed session 13. 
May 13 00:26:01.064036 containerd[1455]: time="2025-05-13T00:26:01.063837098Z" level=info msg="CreateContainer within sandbox \"54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"701705088106e1d038528651917b581e549eae9ffed5422d09bdf5258e5f113e\"" May 13 00:26:01.064376 containerd[1455]: time="2025-05-13T00:26:01.064344703Z" level=info msg="StartContainer for \"701705088106e1d038528651917b581e549eae9ffed5422d09bdf5258e5f113e\"" May 13 00:26:01.073121 containerd[1455]: time="2025-05-13T00:26:01.073062332Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:01.074082 containerd[1455]: time="2025-05-13T00:26:01.073821941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 13 00:26:01.076272 containerd[1455]: time="2025-05-13T00:26:01.076237852Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:01.079099 containerd[1455]: time="2025-05-13T00:26:01.079038808Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:01.079809 containerd[1455]: time="2025-05-13T00:26:01.079752389Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 3.633784073s" May 13 00:26:01.079866 containerd[1455]: 
time="2025-05-13T00:26:01.079817441Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 13 00:26:01.081473 containerd[1455]: time="2025-05-13T00:26:01.081293948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 13 00:26:01.083810 containerd[1455]: time="2025-05-13T00:26:01.083771155Z" level=info msg="CreateContainer within sandbox \"1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 00:26:01.097671 systemd[1]: Started cri-containerd-701705088106e1d038528651917b581e549eae9ffed5422d09bdf5258e5f113e.scope - libcontainer container 701705088106e1d038528651917b581e549eae9ffed5422d09bdf5258e5f113e. May 13 00:26:01.099579 containerd[1455]: time="2025-05-13T00:26:01.099515976Z" level=info msg="CreateContainer within sandbox \"1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"48ff54d6307d358dea74d8e38769d5b547a119417a706c40bcab2a39c6c21800\"" May 13 00:26:01.100976 containerd[1455]: time="2025-05-13T00:26:01.100287757Z" level=info msg="StartContainer for \"48ff54d6307d358dea74d8e38769d5b547a119417a706c40bcab2a39c6c21800\"" May 13 00:26:01.128703 systemd[1]: Started cri-containerd-48ff54d6307d358dea74d8e38769d5b547a119417a706c40bcab2a39c6c21800.scope - libcontainer container 48ff54d6307d358dea74d8e38769d5b547a119417a706c40bcab2a39c6c21800. 
May 13 00:26:01.205037 containerd[1455]: time="2025-05-13T00:26:01.204968018Z" level=info msg="StartContainer for \"701705088106e1d038528651917b581e549eae9ffed5422d09bdf5258e5f113e\" returns successfully" May 13 00:26:01.205304 containerd[1455]: time="2025-05-13T00:26:01.205067255Z" level=info msg="StartContainer for \"48ff54d6307d358dea74d8e38769d5b547a119417a706c40bcab2a39c6c21800\" returns successfully" May 13 00:26:01.516691 systemd-networkd[1385]: caliab44238195e: Gained IPv6LL May 13 00:26:01.904523 kubelet[2476]: E0513 00:26:01.904397 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:26:02.009710 systemd[1]: run-containerd-runc-k8s.io-701705088106e1d038528651917b581e549eae9ffed5422d09bdf5258e5f113e-runc.uzTYtO.mount: Deactivated successfully. May 13 00:26:02.089677 systemd-networkd[1385]: califf2620a7d8d: Gained IPv6LL May 13 00:26:02.266447 kubelet[2476]: I0513 00:26:02.265159 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-t7b9x" podStartSLOduration=38.265139409 podStartE2EDuration="38.265139409s" podCreationTimestamp="2025-05-13 00:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 00:26:02.265028741 +0000 UTC m=+43.780608618" watchObservedRunningTime="2025-05-13 00:26:02.265139409 +0000 UTC m=+43.780719256" May 13 00:26:02.266447 kubelet[2476]: I0513 00:26:02.265290 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9bb7cb75b-6pgjr" podStartSLOduration=27.628736831 podStartE2EDuration="31.265284411s" podCreationTimestamp="2025-05-13 00:25:31 +0000 UTC" firstStartedPulling="2025-05-13 00:25:57.444528488 +0000 UTC m=+38.960108335" lastFinishedPulling="2025-05-13 00:26:01.081076078 +0000 UTC 
m=+42.596655915" observedRunningTime="2025-05-13 00:26:02.071701082 +0000 UTC m=+43.587280929" watchObservedRunningTime="2025-05-13 00:26:02.265284411 +0000 UTC m=+43.780864258" May 13 00:26:02.602726 systemd-networkd[1385]: calidcedcd1e742: Gained IPv6LL May 13 00:26:02.907074 kubelet[2476]: E0513 00:26:02.906833 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:26:03.908128 kubelet[2476]: E0513 00:26:03.908097 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 00:26:03.938345 containerd[1455]: time="2025-05-13T00:26:03.938296974Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:03.939232 containerd[1455]: time="2025-05-13T00:26:03.939176848Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 13 00:26:03.940401 containerd[1455]: time="2025-05-13T00:26:03.940362946Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:03.942444 containerd[1455]: time="2025-05-13T00:26:03.942413611Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:03.943056 containerd[1455]: time="2025-05-13T00:26:03.943023888Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag 
\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 2.861697729s" May 13 00:26:03.943095 containerd[1455]: time="2025-05-13T00:26:03.943054456Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 13 00:26:03.944066 containerd[1455]: time="2025-05-13T00:26:03.944035820Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 13 00:26:03.951151 containerd[1455]: time="2025-05-13T00:26:03.951006281Z" level=info msg="CreateContainer within sandbox \"9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 13 00:26:03.966562 containerd[1455]: time="2025-05-13T00:26:03.966487729Z" level=info msg="CreateContainer within sandbox \"9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"d69892b545250cbe917c03a3e047a7c5ac3d7cd3c98a0bd7137c00101db6ad50\"" May 13 00:26:03.967049 containerd[1455]: time="2025-05-13T00:26:03.967014349Z" level=info msg="StartContainer for \"d69892b545250cbe917c03a3e047a7c5ac3d7cd3c98a0bd7137c00101db6ad50\"" May 13 00:26:03.999945 systemd[1]: Started cri-containerd-d69892b545250cbe917c03a3e047a7c5ac3d7cd3c98a0bd7137c00101db6ad50.scope - libcontainer container d69892b545250cbe917c03a3e047a7c5ac3d7cd3c98a0bd7137c00101db6ad50. 
May 13 00:26:04.057278 containerd[1455]: time="2025-05-13T00:26:04.057229815Z" level=info msg="StartContainer for \"d69892b545250cbe917c03a3e047a7c5ac3d7cd3c98a0bd7137c00101db6ad50\" returns successfully" May 13 00:26:04.977736 kubelet[2476]: I0513 00:26:04.977591 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-79ff8699d-dbkcf" podStartSLOduration=27.903887192 podStartE2EDuration="33.977569872s" podCreationTimestamp="2025-05-13 00:25:31 +0000 UTC" firstStartedPulling="2025-05-13 00:25:57.870245748 +0000 UTC m=+39.385825595" lastFinishedPulling="2025-05-13 00:26:03.943928428 +0000 UTC m=+45.459508275" observedRunningTime="2025-05-13 00:26:04.922823529 +0000 UTC m=+46.438403386" watchObservedRunningTime="2025-05-13 00:26:04.977569872 +0000 UTC m=+46.493149719" May 13 00:26:06.066792 systemd[1]: Started sshd@13-10.0.0.89:22-10.0.0.1:54130.service - OpenSSH per-connection server daemon (10.0.0.1:54130). May 13 00:26:06.496381 sshd[5079]: Accepted publickey for core from 10.0.0.1 port 54130 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:06.497903 sshd[5079]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:06.503013 systemd-logind[1436]: New session 14 of user core. May 13 00:26:06.509693 systemd[1]: Started session-14.scope - Session 14 of User core. May 13 00:26:06.671055 sshd[5079]: pam_unix(sshd:session): session closed for user core May 13 00:26:06.676338 systemd[1]: sshd@13-10.0.0.89:22-10.0.0.1:54130.service: Deactivated successfully. May 13 00:26:06.679122 systemd[1]: session-14.scope: Deactivated successfully. May 13 00:26:06.680236 systemd-logind[1436]: Session 14 logged out. Waiting for processes to exit. May 13 00:26:06.681419 systemd-logind[1436]: Removed session 14. 
May 13 00:26:07.121068 containerd[1455]: time="2025-05-13T00:26:07.121005344Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:07.186215 containerd[1455]: time="2025-05-13T00:26:07.186128027Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 13 00:26:07.188389 containerd[1455]: time="2025-05-13T00:26:07.188345953Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:07.191002 containerd[1455]: time="2025-05-13T00:26:07.190950466Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:07.191606 containerd[1455]: time="2025-05-13T00:26:07.191565071Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 3.247494947s" May 13 00:26:07.191645 containerd[1455]: time="2025-05-13T00:26:07.191602661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 13 00:26:07.192673 containerd[1455]: time="2025-05-13T00:26:07.192631234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 00:26:07.193636 containerd[1455]: time="2025-05-13T00:26:07.193600896Z" level=info msg="CreateContainer within sandbox \"80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0\" for container 
&ContainerMetadata{Name:calico-csi,Attempt:0,}" May 13 00:26:07.212338 containerd[1455]: time="2025-05-13T00:26:07.212281640Z" level=info msg="CreateContainer within sandbox \"80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d4948e3ed5cf469bbc4aba088417fc127a384ccc285a1df835740c34bce12da4\"" May 13 00:26:07.213669 containerd[1455]: time="2025-05-13T00:26:07.212882538Z" level=info msg="StartContainer for \"d4948e3ed5cf469bbc4aba088417fc127a384ccc285a1df835740c34bce12da4\"" May 13 00:26:07.249669 systemd[1]: Started cri-containerd-d4948e3ed5cf469bbc4aba088417fc127a384ccc285a1df835740c34bce12da4.scope - libcontainer container d4948e3ed5cf469bbc4aba088417fc127a384ccc285a1df835740c34bce12da4. May 13 00:26:07.279421 containerd[1455]: time="2025-05-13T00:26:07.279367299Z" level=info msg="StartContainer for \"d4948e3ed5cf469bbc4aba088417fc127a384ccc285a1df835740c34bce12da4\" returns successfully" May 13 00:26:07.593662 containerd[1455]: time="2025-05-13T00:26:07.593604198Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:07.594582 containerd[1455]: time="2025-05-13T00:26:07.594512575Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 13 00:26:07.596714 containerd[1455]: time="2025-05-13T00:26:07.596685587Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 404.021061ms" May 13 00:26:07.596779 containerd[1455]: time="2025-05-13T00:26:07.596716585Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 13 00:26:07.597591 containerd[1455]: time="2025-05-13T00:26:07.597459361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 13 00:26:07.598685 containerd[1455]: time="2025-05-13T00:26:07.598578564Z" level=info msg="CreateContainer within sandbox \"bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 00:26:07.613217 containerd[1455]: time="2025-05-13T00:26:07.613165467Z" level=info msg="CreateContainer within sandbox \"bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1afc9310537e2cce833b82ace635b4b622f79d95357f4b6a1f0b45bdcfbd60fc\"" May 13 00:26:07.613716 containerd[1455]: time="2025-05-13T00:26:07.613692006Z" level=info msg="StartContainer for \"1afc9310537e2cce833b82ace635b4b622f79d95357f4b6a1f0b45bdcfbd60fc\"" May 13 00:26:07.646721 systemd[1]: Started cri-containerd-1afc9310537e2cce833b82ace635b4b622f79d95357f4b6a1f0b45bdcfbd60fc.scope - libcontainer container 1afc9310537e2cce833b82ace635b4b622f79d95357f4b6a1f0b45bdcfbd60fc. 
May 13 00:26:07.691248 containerd[1455]: time="2025-05-13T00:26:07.691193434Z" level=info msg="StartContainer for \"1afc9310537e2cce833b82ace635b4b622f79d95357f4b6a1f0b45bdcfbd60fc\" returns successfully" May 13 00:26:07.929991 kubelet[2476]: I0513 00:26:07.929557 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9bb7cb75b-h6m65" podStartSLOduration=30.09722386 podStartE2EDuration="36.929526006s" podCreationTimestamp="2025-05-13 00:25:31 +0000 UTC" firstStartedPulling="2025-05-13 00:26:00.765068598 +0000 UTC m=+42.280648445" lastFinishedPulling="2025-05-13 00:26:07.597370744 +0000 UTC m=+49.112950591" observedRunningTime="2025-05-13 00:26:07.929313797 +0000 UTC m=+49.444893644" watchObservedRunningTime="2025-05-13 00:26:07.929526006 +0000 UTC m=+49.445105853" May 13 00:26:08.923373 kubelet[2476]: I0513 00:26:08.923335 2476 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 00:26:09.821913 containerd[1455]: time="2025-05-13T00:26:09.821854931Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:09.823073 containerd[1455]: time="2025-05-13T00:26:09.823030911Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 13 00:26:09.824251 containerd[1455]: time="2025-05-13T00:26:09.824221137Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:09.826555 containerd[1455]: time="2025-05-13T00:26:09.826495458Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 00:26:09.827087 
containerd[1455]: time="2025-05-13T00:26:09.827055590Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.229571463s" May 13 00:26:09.827123 containerd[1455]: time="2025-05-13T00:26:09.827086658Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 13 00:26:09.829182 containerd[1455]: time="2025-05-13T00:26:09.829122682Z" level=info msg="CreateContainer within sandbox \"80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 13 00:26:09.843146 containerd[1455]: time="2025-05-13T00:26:09.843108271Z" level=info msg="CreateContainer within sandbox \"80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7e76edde89d8b612a524b74a0c3bd4fc482df41d197839667873fa03fc95ecc5\"" May 13 00:26:09.843686 containerd[1455]: time="2025-05-13T00:26:09.843627087Z" level=info msg="StartContainer for \"7e76edde89d8b612a524b74a0c3bd4fc482df41d197839667873fa03fc95ecc5\"" May 13 00:26:09.875673 systemd[1]: Started cri-containerd-7e76edde89d8b612a524b74a0c3bd4fc482df41d197839667873fa03fc95ecc5.scope - libcontainer container 7e76edde89d8b612a524b74a0c3bd4fc482df41d197839667873fa03fc95ecc5. 
May 13 00:26:09.903446 containerd[1455]: time="2025-05-13T00:26:09.903402191Z" level=info msg="StartContainer for \"7e76edde89d8b612a524b74a0c3bd4fc482df41d197839667873fa03fc95ecc5\" returns successfully" May 13 00:26:09.936935 kubelet[2476]: I0513 00:26:09.936859 2476 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-xgblg" podStartSLOduration=29.635820831 podStartE2EDuration="38.936844966s" podCreationTimestamp="2025-05-13 00:25:31 +0000 UTC" firstStartedPulling="2025-05-13 00:26:00.526844863 +0000 UTC m=+42.042424710" lastFinishedPulling="2025-05-13 00:26:09.827868998 +0000 UTC m=+51.343448845" observedRunningTime="2025-05-13 00:26:09.936520897 +0000 UTC m=+51.452100744" watchObservedRunningTime="2025-05-13 00:26:09.936844966 +0000 UTC m=+51.452424813" May 13 00:26:10.617900 kubelet[2476]: I0513 00:26:10.617864 2476 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 13 00:26:10.617900 kubelet[2476]: I0513 00:26:10.617895 2476 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 13 00:26:11.683474 systemd[1]: Started sshd@14-10.0.0.89:22-10.0.0.1:58694.service - OpenSSH per-connection server daemon (10.0.0.1:58694). May 13 00:26:11.721959 sshd[5230]: Accepted publickey for core from 10.0.0.1 port 58694 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:11.723870 sshd[5230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:11.728233 systemd-logind[1436]: New session 15 of user core. May 13 00:26:11.741717 systemd[1]: Started session-15.scope - Session 15 of User core. 
May 13 00:26:11.864298 sshd[5230]: pam_unix(sshd:session): session closed for user core May 13 00:26:11.875319 systemd[1]: sshd@14-10.0.0.89:22-10.0.0.1:58694.service: Deactivated successfully. May 13 00:26:11.877071 systemd[1]: session-15.scope: Deactivated successfully. May 13 00:26:11.878672 systemd-logind[1436]: Session 15 logged out. Waiting for processes to exit. May 13 00:26:11.887775 systemd[1]: Started sshd@15-10.0.0.89:22-10.0.0.1:58704.service - OpenSSH per-connection server daemon (10.0.0.1:58704). May 13 00:26:11.888732 systemd-logind[1436]: Removed session 15. May 13 00:26:11.919825 sshd[5244]: Accepted publickey for core from 10.0.0.1 port 58704 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:11.921525 sshd[5244]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:11.925908 systemd-logind[1436]: New session 16 of user core. May 13 00:26:11.936771 systemd[1]: Started session-16.scope - Session 16 of User core. May 13 00:26:12.171995 sshd[5244]: pam_unix(sshd:session): session closed for user core May 13 00:26:12.184416 systemd[1]: sshd@15-10.0.0.89:22-10.0.0.1:58704.service: Deactivated successfully. May 13 00:26:12.186307 systemd[1]: session-16.scope: Deactivated successfully. May 13 00:26:12.188160 systemd-logind[1436]: Session 16 logged out. Waiting for processes to exit. May 13 00:26:12.192998 systemd[1]: Started sshd@16-10.0.0.89:22-10.0.0.1:58718.service - OpenSSH per-connection server daemon (10.0.0.1:58718). May 13 00:26:12.194012 systemd-logind[1436]: Removed session 16. May 13 00:26:12.224801 sshd[5257]: Accepted publickey for core from 10.0.0.1 port 58718 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:12.226306 sshd[5257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:12.230344 systemd-logind[1436]: New session 17 of user core. 
May 13 00:26:12.240668 systemd[1]: Started session-17.scope - Session 17 of User core. May 13 00:26:13.363424 sshd[5257]: pam_unix(sshd:session): session closed for user core May 13 00:26:13.376687 systemd[1]: sshd@16-10.0.0.89:22-10.0.0.1:58718.service: Deactivated successfully. May 13 00:26:13.379003 systemd[1]: session-17.scope: Deactivated successfully. May 13 00:26:13.380193 systemd-logind[1436]: Session 17 logged out. Waiting for processes to exit. May 13 00:26:13.384384 systemd-logind[1436]: Removed session 17. May 13 00:26:13.388854 systemd[1]: Started sshd@17-10.0.0.89:22-10.0.0.1:58724.service - OpenSSH per-connection server daemon (10.0.0.1:58724). May 13 00:26:13.425505 sshd[5277]: Accepted publickey for core from 10.0.0.1 port 58724 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:13.426979 sshd[5277]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:13.431467 systemd-logind[1436]: New session 18 of user core. May 13 00:26:13.440691 systemd[1]: Started session-18.scope - Session 18 of User core. May 13 00:26:13.660147 sshd[5277]: pam_unix(sshd:session): session closed for user core May 13 00:26:13.669807 systemd[1]: sshd@17-10.0.0.89:22-10.0.0.1:58724.service: Deactivated successfully. May 13 00:26:13.671804 systemd[1]: session-18.scope: Deactivated successfully. May 13 00:26:13.673699 systemd-logind[1436]: Session 18 logged out. Waiting for processes to exit. May 13 00:26:13.684912 systemd[1]: Started sshd@18-10.0.0.89:22-10.0.0.1:58726.service - OpenSSH per-connection server daemon (10.0.0.1:58726). May 13 00:26:13.685966 systemd-logind[1436]: Removed session 18. 
May 13 00:26:13.715405 sshd[5290]: Accepted publickey for core from 10.0.0.1 port 58726 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:13.717119 sshd[5290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:13.721750 systemd-logind[1436]: New session 19 of user core. May 13 00:26:13.731750 systemd[1]: Started session-19.scope - Session 19 of User core. May 13 00:26:13.843163 sshd[5290]: pam_unix(sshd:session): session closed for user core May 13 00:26:13.847393 systemd[1]: sshd@18-10.0.0.89:22-10.0.0.1:58726.service: Deactivated successfully. May 13 00:26:13.849556 systemd[1]: session-19.scope: Deactivated successfully. May 13 00:26:13.850146 systemd-logind[1436]: Session 19 logged out. Waiting for processes to exit. May 13 00:26:13.851127 systemd-logind[1436]: Removed session 19. May 13 00:26:15.907869 kubelet[2476]: I0513 00:26:15.907818 2476 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 00:26:18.551691 containerd[1455]: time="2025-05-13T00:26:18.551586610Z" level=info msg="StopPodSandbox for \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\"" May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.586 [WARNING][5328] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0", GenerateName:"calico-apiserver-9bb7cb75b-", Namespace:"calico-apiserver", SelfLink:"", UID:"3654df1f-4f7d-4d28-8abd-dfe02b50107c", ResourceVersion:"1078", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb7cb75b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509", Pod:"calico-apiserver-9bb7cb75b-h6m65", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidcedcd1e742", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.586 [INFO][5328] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.586 [INFO][5328] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" iface="eth0" netns="" May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.586 [INFO][5328] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.586 [INFO][5328] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.606 [INFO][5339] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.606 [INFO][5339] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.606 [INFO][5339] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.611 [WARNING][5339] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.611 [INFO][5339] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.612 [INFO][5339] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:18.617758 containerd[1455]: 2025-05-13 00:26:18.614 [INFO][5328] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:26:18.618202 containerd[1455]: time="2025-05-13T00:26:18.617799020Z" level=info msg="TearDown network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\" successfully" May 13 00:26:18.618202 containerd[1455]: time="2025-05-13T00:26:18.617829788Z" level=info msg="StopPodSandbox for \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\" returns successfully" May 13 00:26:18.625711 containerd[1455]: time="2025-05-13T00:26:18.625682179Z" level=info msg="RemovePodSandbox for \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\"" May 13 00:26:18.628135 containerd[1455]: time="2025-05-13T00:26:18.628112430Z" level=info msg="Forcibly stopping sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\"" May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.661 [WARNING][5362] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0", GenerateName:"calico-apiserver-9bb7cb75b-", Namespace:"calico-apiserver", SelfLink:"", UID:"3654df1f-4f7d-4d28-8abd-dfe02b50107c", ResourceVersion:"1078", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb7cb75b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bb5d190abbbd2f31c34e02394b623894329354669eee1a34091e132c1edf6509", Pod:"calico-apiserver-9bb7cb75b-h6m65", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidcedcd1e742", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.661 [INFO][5362] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.661 [INFO][5362] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" iface="eth0" netns="" May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.661 [INFO][5362] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.661 [INFO][5362] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.681 [INFO][5370] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.681 [INFO][5370] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.681 [INFO][5370] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.686 [WARNING][5370] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.687 [INFO][5370] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" HandleID="k8s-pod-network.e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--h6m65-eth0" May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.687 [INFO][5370] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:18.692023 containerd[1455]: 2025-05-13 00:26:18.690 [INFO][5362] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890" May 13 00:26:18.692505 containerd[1455]: time="2025-05-13T00:26:18.692070159Z" level=info msg="TearDown network for sandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\" successfully" May 13 00:26:18.699329 containerd[1455]: time="2025-05-13T00:26:18.699293318Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
May 13 00:26:18.699368 containerd[1455]: time="2025-05-13T00:26:18.699348141Z" level=info msg="RemovePodSandbox \"e59413043bcd051d7a261fb0a9bba7ce6f05bf97cee2f77288435c8bc135a890\" returns successfully" May 13 00:26:18.699865 containerd[1455]: time="2025-05-13T00:26:18.699842850Z" level=info msg="StopPodSandbox for \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\"" May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.731 [WARNING][5393] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"44594c6e-aa9f-4fbd-9913-db1bedcc1034", ResourceVersion:"931", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d", Pod:"coredns-668d6bf9bc-t7b9x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califf2620a7d8d", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.731 [INFO][5393] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.731 [INFO][5393] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" iface="eth0" netns="" May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.731 [INFO][5393] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.731 [INFO][5393] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.749 [INFO][5401] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.749 [INFO][5401] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.749 [INFO][5401] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.754 [WARNING][5401] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.754 [INFO][5401] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.755 [INFO][5401] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:18.759833 containerd[1455]: 2025-05-13 00:26:18.757 [INFO][5393] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:26:18.760312 containerd[1455]: time="2025-05-13T00:26:18.759886841Z" level=info msg="TearDown network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\" successfully" May 13 00:26:18.760312 containerd[1455]: time="2025-05-13T00:26:18.759911167Z" level=info msg="StopPodSandbox for \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\" returns successfully" May 13 00:26:18.760394 containerd[1455]: time="2025-05-13T00:26:18.760324603Z" level=info msg="RemovePodSandbox for \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\"" May 13 00:26:18.760394 containerd[1455]: time="2025-05-13T00:26:18.760345462Z" level=info msg="Forcibly stopping sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\"" May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.791 [WARNING][5424] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"44594c6e-aa9f-4fbd-9913-db1bedcc1034", ResourceVersion:"931", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"54293280c695bdbbdf3080f857632ce1072d293d0f8b702cdb97f620696e126d", Pod:"coredns-668d6bf9bc-t7b9x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califf2620a7d8d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.791 [INFO][5424] cni-plugin/k8s.go 608: Cleaning up netns 
ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.791 [INFO][5424] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" iface="eth0" netns="" May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.791 [INFO][5424] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.791 [INFO][5424] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.809 [INFO][5433] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.810 [INFO][5433] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.810 [INFO][5433] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.815 [WARNING][5433] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.815 [INFO][5433] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" HandleID="k8s-pod-network.fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" Workload="localhost-k8s-coredns--668d6bf9bc--t7b9x-eth0" May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.816 [INFO][5433] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:18.821401 containerd[1455]: 2025-05-13 00:26:18.819 [INFO][5424] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a" May 13 00:26:18.821401 containerd[1455]: time="2025-05-13T00:26:18.821360747Z" level=info msg="TearDown network for sandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\" successfully" May 13 00:26:18.825917 containerd[1455]: time="2025-05-13T00:26:18.825876313Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
May 13 00:26:18.825917 containerd[1455]: time="2025-05-13T00:26:18.825927169Z" level=info msg="RemovePodSandbox \"fb7c8b5c4df7a3f91a0091f904cf06316bbbda3ef2179502e302e58bbae8be7a\" returns successfully" May 13 00:26:18.826439 containerd[1455]: time="2025-05-13T00:26:18.826405417Z" level=info msg="StopPodSandbox for \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\"" May 13 00:26:18.865832 systemd[1]: Started sshd@19-10.0.0.89:22-10.0.0.1:33384.service - OpenSSH per-connection server daemon (10.0.0.1:33384). May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.862 [WARNING][5456] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--tclg9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7de40860-c4d9-46ea-8cfc-0e1375b1cb33", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111", Pod:"coredns-668d6bf9bc-tclg9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali27a5de7679e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.862 [INFO][5456] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.862 [INFO][5456] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" iface="eth0" netns="" May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.862 [INFO][5456] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.862 [INFO][5456] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.881 [INFO][5467] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.881 [INFO][5467] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.881 [INFO][5467] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.886 [WARNING][5467] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.886 [INFO][5467] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.888 [INFO][5467] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:18.892830 containerd[1455]: 2025-05-13 00:26:18.890 [INFO][5456] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:26:18.893309 containerd[1455]: time="2025-05-13T00:26:18.892845285Z" level=info msg="TearDown network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\" successfully" May 13 00:26:18.893309 containerd[1455]: time="2025-05-13T00:26:18.892869911Z" level=info msg="StopPodSandbox for \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\" returns successfully" May 13 00:26:18.893368 containerd[1455]: time="2025-05-13T00:26:18.893326318Z" level=info msg="RemovePodSandbox for \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\"" May 13 00:26:18.893368 containerd[1455]: time="2025-05-13T00:26:18.893363548Z" level=info msg="Forcibly stopping sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\"" May 13 00:26:18.900721 sshd[5464]: Accepted publickey for core from 10.0.0.1 port 33384 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw May 13 00:26:18.902277 sshd[5464]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 00:26:18.907070 systemd-logind[1436]: New session 20 of user core. May 13 00:26:18.912653 systemd[1]: Started session-20.scope - Session 20 of User core. May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.925 [WARNING][5491] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--tclg9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7de40860-c4d9-46ea-8cfc-0e1375b1cb33", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c319034d46098627ed66f87cca8bf24110a10acae27b3049570386cc904ed111", Pod:"coredns-668d6bf9bc-tclg9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali27a5de7679e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.925 [INFO][5491] cni-plugin/k8s.go 608: Cleaning up netns 
ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.925 [INFO][5491] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" iface="eth0" netns="" May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.925 [INFO][5491] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.925 [INFO][5491] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.948 [INFO][5502] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.948 [INFO][5502] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.948 [INFO][5502] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.953 [WARNING][5502] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.953 [INFO][5502] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" HandleID="k8s-pod-network.4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" Workload="localhost-k8s-coredns--668d6bf9bc--tclg9-eth0" May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.954 [INFO][5502] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:18.958773 containerd[1455]: 2025-05-13 00:26:18.956 [INFO][5491] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5" May 13 00:26:18.959343 containerd[1455]: time="2025-05-13T00:26:18.958805242Z" level=info msg="TearDown network for sandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\" successfully" May 13 00:26:18.963136 containerd[1455]: time="2025-05-13T00:26:18.963078543Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
May 13 00:26:18.963243 containerd[1455]: time="2025-05-13T00:26:18.963159896Z" level=info msg="RemovePodSandbox \"4bebfbdce5eb932309474ffa882fb7c78e1389e13eed19ae358a450745d091f5\" returns successfully" May 13 00:26:18.964209 containerd[1455]: time="2025-05-13T00:26:18.964040790Z" level=info msg="StopPodSandbox for \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\"" May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:18.997 [WARNING][5532] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--xgblg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"93974201-6dc9-4f10-8077-3a1e2417dccb", ResourceVersion:"1015", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0", Pod:"csi-node-driver-xgblg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliab44238195e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:18.998 [INFO][5532] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:18.998 [INFO][5532] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" iface="eth0" netns="" May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:18.998 [INFO][5532] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:18.998 [INFO][5532] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:19.017 [INFO][5541] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0" May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:19.017 [INFO][5541] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:19.017 [INFO][5541] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:19.025 [WARNING][5541] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0" May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:19.025 [INFO][5541] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0" May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:19.026 [INFO][5541] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:19.032604 containerd[1455]: 2025-05-13 00:26:19.029 [INFO][5532] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:26:19.033194 containerd[1455]: time="2025-05-13T00:26:19.032647302Z" level=info msg="TearDown network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\" successfully" May 13 00:26:19.033194 containerd[1455]: time="2025-05-13T00:26:19.032672529Z" level=info msg="StopPodSandbox for \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\" returns successfully" May 13 00:26:19.033301 containerd[1455]: time="2025-05-13T00:26:19.033274830Z" level=info msg="RemovePodSandbox for \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\"" May 13 00:26:19.033343 containerd[1455]: time="2025-05-13T00:26:19.033305387Z" level=info msg="Forcibly stopping sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\"" May 13 00:26:19.038530 sshd[5464]: pam_unix(sshd:session): session closed for user core May 13 00:26:19.042141 systemd[1]: sshd@19-10.0.0.89:22-10.0.0.1:33384.service: Deactivated successfully. 
May 13 00:26:19.044138 systemd[1]: session-20.scope: Deactivated successfully. May 13 00:26:19.046170 systemd-logind[1436]: Session 20 logged out. Waiting for processes to exit. May 13 00:26:19.047113 systemd-logind[1436]: Removed session 20. May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.068 [WARNING][5564] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--xgblg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"93974201-6dc9-4f10-8077-3a1e2417dccb", ResourceVersion:"1015", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"80a7d2ad07edb17024c0021c6772e2e2dcc34154c1ed5c51e9f6d3c5141c28f0", Pod:"csi-node-driver-xgblg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliab44238195e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.068 [INFO][5564] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.068 [INFO][5564] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" iface="eth0" netns="" May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.068 [INFO][5564] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.068 [INFO][5564] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.088 [INFO][5574] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0" May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.089 [INFO][5574] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.089 [INFO][5574] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.094 [WARNING][5574] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0" May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.094 [INFO][5574] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" HandleID="k8s-pod-network.510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" Workload="localhost-k8s-csi--node--driver--xgblg-eth0" May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.096 [INFO][5574] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:19.100911 containerd[1455]: 2025-05-13 00:26:19.098 [INFO][5564] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25" May 13 00:26:19.100911 containerd[1455]: time="2025-05-13T00:26:19.100889407Z" level=info msg="TearDown network for sandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\" successfully" May 13 00:26:19.111651 containerd[1455]: time="2025-05-13T00:26:19.111580385Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
May 13 00:26:19.111651 containerd[1455]: time="2025-05-13T00:26:19.111655195Z" level=info msg="RemovePodSandbox \"510a32c2ce2e0658afd17435eff77d6854895b642031aa8fccb8f98dace4af25\" returns successfully" May 13 00:26:19.112157 containerd[1455]: time="2025-05-13T00:26:19.112126529Z" level=info msg="StopPodSandbox for \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\"" May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.146 [WARNING][5598] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0", GenerateName:"calico-apiserver-9bb7cb75b-", Namespace:"calico-apiserver", SelfLink:"", UID:"48db1284-6ae2-4942-9a9b-c26e0e5a874e", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb7cb75b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f", Pod:"calico-apiserver-9bb7cb75b-6pgjr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3bdf2d9dae4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.146 [INFO][5598] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.146 [INFO][5598] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" iface="eth0" netns="" May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.146 [INFO][5598] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.146 [INFO][5598] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.165 [INFO][5606] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.166 [INFO][5606] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.166 [INFO][5606] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.171 [WARNING][5606] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.171 [INFO][5606] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.172 [INFO][5606] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:19.176990 containerd[1455]: 2025-05-13 00:26:19.174 [INFO][5598] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:26:19.177417 containerd[1455]: time="2025-05-13T00:26:19.177024377Z" level=info msg="TearDown network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\" successfully" May 13 00:26:19.177417 containerd[1455]: time="2025-05-13T00:26:19.177049945Z" level=info msg="StopPodSandbox for \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\" returns successfully" May 13 00:26:19.177609 containerd[1455]: time="2025-05-13T00:26:19.177577085Z" level=info msg="RemovePodSandbox for \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\"" May 13 00:26:19.177635 containerd[1455]: time="2025-05-13T00:26:19.177612852Z" level=info msg="Forcibly stopping sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\"" May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.212 [WARNING][5628] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0", GenerateName:"calico-apiserver-9bb7cb75b-", Namespace:"calico-apiserver", SelfLink:"", UID:"48db1284-6ae2-4942-9a9b-c26e0e5a874e", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb7cb75b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1388959ec56f0f66eefa5eb147f1e32b3263ac778d310ac7ec4f99603ced243f", Pod:"calico-apiserver-9bb7cb75b-6pgjr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3bdf2d9dae4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.212 [INFO][5628] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.212 [INFO][5628] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" iface="eth0" netns="" May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.212 [INFO][5628] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.212 [INFO][5628] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.232 [INFO][5637] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.232 [INFO][5637] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.232 [INFO][5637] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.238 [WARNING][5637] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.238 [INFO][5637] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" HandleID="k8s-pod-network.e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" Workload="localhost-k8s-calico--apiserver--9bb7cb75b--6pgjr-eth0" May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.239 [INFO][5637] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:19.244525 containerd[1455]: 2025-05-13 00:26:19.242 [INFO][5628] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f" May 13 00:26:19.244977 containerd[1455]: time="2025-05-13T00:26:19.244585593Z" level=info msg="TearDown network for sandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\" successfully" May 13 00:26:19.248588 containerd[1455]: time="2025-05-13T00:26:19.248510752Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
May 13 00:26:19.248588 containerd[1455]: time="2025-05-13T00:26:19.248569592Z" level=info msg="RemovePodSandbox \"e02071d641e1fae2896a18fd24a5d7fe803634b2c24cd5b0d44603f2f0dbbb5f\" returns successfully" May 13 00:26:19.249127 containerd[1455]: time="2025-05-13T00:26:19.249088466Z" level=info msg="StopPodSandbox for \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\"" May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.283 [WARNING][5660] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0", GenerateName:"calico-kube-controllers-79ff8699d-", Namespace:"calico-system", SelfLink:"", UID:"17288722-b2a9-43e7-9881-20dbd0f1ae77", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79ff8699d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0", Pod:"calico-kube-controllers-79ff8699d-dbkcf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0948978a615", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.283 [INFO][5660] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.283 [INFO][5660] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" iface="eth0" netns="" May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.283 [INFO][5660] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.283 [INFO][5660] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.303 [INFO][5668] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0" May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.303 [INFO][5668] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.303 [INFO][5668] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.308 [WARNING][5668] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0" May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.308 [INFO][5668] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0" May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.309 [INFO][5668] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 00:26:19.313969 containerd[1455]: 2025-05-13 00:26:19.311 [INFO][5660] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" May 13 00:26:19.314393 containerd[1455]: time="2025-05-13T00:26:19.314007382Z" level=info msg="TearDown network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\" successfully" May 13 00:26:19.314393 containerd[1455]: time="2025-05-13T00:26:19.314032790Z" level=info msg="StopPodSandbox for \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\" returns successfully" May 13 00:26:19.314570 containerd[1455]: time="2025-05-13T00:26:19.314517490Z" level=info msg="RemovePodSandbox for \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\"" May 13 00:26:19.314608 containerd[1455]: time="2025-05-13T00:26:19.314573525Z" level=info msg="Forcibly stopping sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\"" May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.347 [WARNING][5691] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0", GenerateName:"calico-kube-controllers-79ff8699d-", Namespace:"calico-system", SelfLink:"", UID:"17288722-b2a9-43e7-9881-20dbd0f1ae77", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79ff8699d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9c1fd3267c6df1629d4ab1493249c5ea85662e1f6152550f1f8cb633be779df0", Pod:"calico-kube-controllers-79ff8699d-dbkcf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0948978a615", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.347 [INFO][5691] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.347 [INFO][5691] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" iface="eth0" netns=""
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.347 [INFO][5691] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78"
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.347 [INFO][5691] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78"
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.368 [INFO][5699] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.368 [INFO][5699] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.368 [INFO][5699] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.373 [WARNING][5699] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist.
Ignoring ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.373 [INFO][5699] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" HandleID="k8s-pod-network.1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78" Workload="localhost-k8s-calico--kube--controllers--79ff8699d--dbkcf-eth0"
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.375 [INFO][5699] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.377 [INFO][5691] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78"
May 13 00:26:19.380250 containerd[1455]: time="2025-05-13T00:26:19.380223825Z" level=info msg="TearDown network for sandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\" successfully"
May 13 00:26:19.384762 containerd[1455]: time="2025-05-13T00:26:19.384735274Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
May 13 00:26:19.384827 containerd[1455]: time="2025-05-13T00:26:19.384779797Z" level=info msg="RemovePodSandbox \"1ac65aa5f154b5c2f3511e4e6ff3578e9d371609fdea32aedafbf6d576dd4d78\" returns successfully"
May 13 00:26:24.051560 systemd[1]: Started sshd@20-10.0.0.89:22-10.0.0.1:33398.service - OpenSSH per-connection server daemon (10.0.0.1:33398).
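The containerd entries above are wrapped in layers: journald's `Month DD HH:MM:SS.ffffff unit[pid]:` prefix, and inside that either the Calico plugin's own log format (`2025-05-13 00:26:19.347 [LEVEL][pid] file.go line: message`) or containerd's logfmt (`time="…" level=… msg="…"`). A minimal sketch of unwrapping the first two layers — the regexes and field names here are my own, not taken from any of these tools:

```python
import re

# journald prefix, e.g. "May 13 00:26:19.380250 containerd[1455]: <body>"
JOURNAL = re.compile(r'^(\w+ \d+ [\d:.]+) (\w+)\[(\d+)\]: (.*)$')
# Calico plugin body, e.g. "2025-05-13 00:26:19.347 [WARNING][5691] cni-plugin/k8s.go 572: <msg>"
CALICO = re.compile(r'^([\d-]+ [\d:.]+) \[(\w+)\]\[(\d+)\] (\S+) (\d+): (.*)$')

def parse(line):
    """Return a dict of journald fields, plus Calico fields when the body matches."""
    m = JOURNAL.match(line)
    if not m:
        return None
    ts, unit, pid, body = m.groups()
    rec = {"ts": ts, "unit": unit, "pid": int(pid), "body": body}
    c = CALICO.match(body)
    if c:
        rec.update(level=c.group(2), source=c.group(4), msg=c.group(6))
    return rec

# Sample taken verbatim from the log above.
line = ('May 13 00:26:19.380250 containerd[1455]: 2025-05-13 00:26:19.347 '
        '[WARNING][5691] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match '
        "WorkloadEndpoint ContainerID, don't delete WEP.")
rec = parse(line)
print(rec["unit"], rec["level"], rec["source"])  # containerd WARNING cni-plugin/k8s.go
```

Splitting on the journald prefix first is what makes the repeated `May 13 00:26:19.380250 containerd[1455]:` runs above intelligible: they are one multi-line Calico message re-stamped line by line by journald.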
May 13 00:26:24.088557 sshd[5710]: Accepted publickey for core from 10.0.0.1 port 33398 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:26:24.090601 sshd[5710]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:26:24.094928 systemd-logind[1436]: New session 21 of user core.
May 13 00:26:24.110695 systemd[1]: Started session-21.scope - Session 21 of User core.
May 13 00:26:24.226567 sshd[5710]: pam_unix(sshd:session): session closed for user core
May 13 00:26:24.230934 systemd[1]: sshd@20-10.0.0.89:22-10.0.0.1:33398.service: Deactivated successfully.
May 13 00:26:24.233261 systemd[1]: session-21.scope: Deactivated successfully.
May 13 00:26:24.234066 systemd-logind[1436]: Session 21 logged out. Waiting for processes to exit.
May 13 00:26:24.235034 systemd-logind[1436]: Removed session 21.
May 13 00:26:25.943327 kubelet[2476]: E0513 00:26:25.943287 2476 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 00:26:29.239174 systemd[1]: Started sshd@21-10.0.0.89:22-10.0.0.1:48180.service - OpenSSH per-connection server daemon (10.0.0.1:48180).
May 13 00:26:29.271695 sshd[5749]: Accepted publickey for core from 10.0.0.1 port 48180 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:26:29.273295 sshd[5749]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:26:29.277644 systemd-logind[1436]: New session 22 of user core.
May 13 00:26:29.287740 systemd[1]: Started session-22.scope - Session 22 of User core.
May 13 00:26:29.395036 sshd[5749]: pam_unix(sshd:session): session closed for user core
May 13 00:26:29.398534 systemd[1]: sshd@21-10.0.0.89:22-10.0.0.1:48180.service: Deactivated successfully.
May 13 00:26:29.400317 systemd[1]: session-22.scope: Deactivated successfully.
May 13 00:26:29.400935 systemd-logind[1436]: Session 22 logged out. Waiting for processes to exit.
May 13 00:26:29.401865 systemd-logind[1436]: Removed session 22.
May 13 00:26:34.409574 systemd[1]: Started sshd@22-10.0.0.89:22-10.0.0.1:48196.service - OpenSSH per-connection server daemon (10.0.0.1:48196).
May 13 00:26:34.442227 sshd[5764]: Accepted publickey for core from 10.0.0.1 port 48196 ssh2: RSA SHA256:C8EB+qIBpDYbEudkwL+hXgYkYPlLQFWTQCbVJQyY2dw
May 13 00:26:34.443760 sshd[5764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 00:26:34.447798 systemd-logind[1436]: New session 23 of user core.
May 13 00:26:34.459661 systemd[1]: Started session-23.scope - Session 23 of User core.
May 13 00:26:34.566966 sshd[5764]: pam_unix(sshd:session): session closed for user core
May 13 00:26:34.572135 systemd[1]: sshd@22-10.0.0.89:22-10.0.0.1:48196.service: Deactivated successfully.
May 13 00:26:34.574167 systemd[1]: session-23.scope: Deactivated successfully.
May 13 00:26:34.575001 systemd-logind[1436]: Session 23 logged out. Waiting for processes to exit.
May 13 00:26:34.576099 systemd-logind[1436]: Removed session 23.
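The recurring kubelet "Nameserver limits exceeded" error in this log stems from the glibc resolver honoring at most three `nameserver` entries in resolv.conf, so kubelet applies the first three and warns about the rest. A toy illustration of that truncation, assuming the limit of 3 (the fourth server below is a hypothetical extra, not taken from this log):

```python
MAX_NAMESERVERS = 3  # glibc resolv.conf limit (MAXNS); kubelet warns past this

def apply_nameserver_limit(servers):
    """Split a nameserver list into the applied prefix and the omitted tail."""
    return servers[:MAX_NAMESERVERS], servers[MAX_NAMESERVERS:]

# "8.8.4.4" is a hypothetical fourth entry added for illustration.
applied, omitted = apply_nameserver_limit(["1.1.1.1", "1.0.0.1", "8.8.8.8", "8.8.4.4"])
print("applied nameserver line:", " ".join(applied))
```

The applied prefix here matches the "applied nameserver line" reported by kubelet above (`1.1.1.1 1.0.0.1 8.8.8.8`); trimming the host's resolv.conf to three entries silences the warning.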