
Re: [Cado-nfs-discuss] Linear Algebra with MPI


  • From: Pierpaolo Santucci <santucci.pierpaolo@gmail.com>
  • To: Emmanuel Thomé <emmanuel.thome@gmail.com>
  • Cc: cado-nfs-discuss@lists.gforge.inria.fr
  • Subject: Re: [Cado-nfs-discuss] Linear Algebra with MPI
  • Date: Thu, 14 Aug 2014 20:00:04 +0200
  • List-archive: <http://lists.gforge.inria.fr/pipermail/cado-nfs-discuss/>
  • List-id: A discussion list for Cado-NFS <cado-nfs-discuss.lists.gforge.inria.fr>

Thanks for the quick response.

In my local.sh there is:

MPI=/home/user/mpich-3.1.2/

Other steps of the GNFS (e.g. Polynomial Selection or Sieving) use MPI correctly.

Regards,

Pierpaolo Santucci



On 14 August 2014 19:47, Emmanuel Thomé <emmanuel.thome@gmail.com> wrote:

Most probably you haven't compiled cado-nfs with MPI support. There is an influential MPI variable which can be set from local.sh.
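For reference, a minimal sketch of that setup (the MPICH path is the one quoted in this thread; adjust it to your installation, and note that the exact rebuild step may differ by cado-nfs version):

```shell
# local.sh at the top of the cado-nfs source tree:
# point MPI at your MPI installation so the build picks it up.
MPI=/home/user/mpich-3.1.2/

# Then rebuild so the bwc binaries are compiled against MPI;
# a stale build tree will keep the non-MPI binaries around.
make clean
make
```

If the binaries were built without MPI, bwc.pl's dispatch runs as a single job, which matches the "needed 2, got 1" inconsistency reported below.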

Please report how far you get. I have little to no internet connection at the moment, but I'll try to help to the extent possible.

E.

On 14 August 2014 19:20, "Pierpaolo Santucci" <santucci.pierpaolo@gmail.com> wrote:
Hello everyone!

I'm working with the linear algebra step of CADO-NFS 2.1. I want to run the Block Wiedemann algorithm on several hosts via MPI (mpich-3.1.2), but the command:

./bwc.pl :complete mpi=2 hosts=host1,host2 thr=4 mn=64 nullspace=left interval=100 matrix=/tmp/cado.XXXXBy837j/c90.merge.small.bin wdir=/tmp/tmp interleaving=0 shuffled_product=1

gives me the following error:

Inconsistency -- exactly 2 == 1 * 2 MPI jobs are needed -- got 1
./dispatch: exited with status 1

What's wrong?

Thanking you in advance.
Regards,

Pierpaolo Santucci

_______________________________________________
Cado-nfs-discuss mailing list
Cado-nfs-discuss@lists.gforge.inria.fr
http://lists.gforge.inria.fr/cgi-bin/mailman/listinfo/cado-nfs-discuss




