- From: Emmanuel Thomé <Emmanuel.Thome@inria.fr>
- To: Barbulescu Razvan <razvan.barbulescu@u-bordeaux.fr>
- Cc: cado-nfs@inria.fr
- Subject: Re: [cado-nfs] compile cado on a cluster
- Date: Tue, 7 Feb 2023 16:34:43 +0100
Hi Razvan,
So far, the approach that I've used when testing on that particular
cluster has been to add the pre-configuration stuff to the local.sh file;
it's a bit ugly, but it works. It's also justified by the fact that the
module setup there has not always been super consistent.
Here's what I have:
# make the "module" command available if the login shell doesn't
# already provide it
if ! type -p module && [ -f /etc/profile.d/modules.sh ] ; then
    . /etc/profile.d/modules.sh
fi
module purge
module load build/cmake
module load compiler/gcc
This should get you going. I tested it a minute ago (with the trivial fix
that I just pushed) and it worked.
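For the record, this goes in local.sh at the top of the cado-nfs source
tree; the build scripts source that file before invoking cmake, and
local.sh.example in the source tree describes the other variables it can
set. As a complete-file sketch (the commented-out lines at the end are
just illustrations of those extra knobs, not something you need here):

# local.sh -- placed at the top of the cado-nfs source tree; sourced
# by the build scripts before cmake is invoked.
if ! type -p module && [ -f /etc/profile.d/modules.sh ] ; then
    . /etc/profile.d/modules.sh
fi
module purge
module load build/cmake
module load compiler/gcc
# other knobs documented in local.sh.example, e.g.:
# CC=gcc
# CXX=g++
# CFLAGS=-O2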
Note that this relies on gmp, hwloc, and python being present on the
system. While this seems to be the case on the particular node that I
tested, it may not be true for all of them, so your mileage may vary.
Be especially cautious: compute nodes on this cluster seem to be running
a venerable CentOS 7 distribution, which is probably not the ideal
setting if you want to avoid bad surprises.
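If you want to check the prerequisites quickly before launching a long
build, something along these lines should do the trick (a rough sketch:
it only verifies that the headers are visible to the compiler you will
actually use):

# run on a compute node, after the module lines above
g++ --version        # wants a reasonably recent g++; the stock
                     # centos 7 compiler (gcc 4.8) is too old
python3 --version    # the cado-nfs.py script needs python 3
echo '#include <gmp.h>'   | g++ -x c++ -E - > /dev/null && echo gmp: ok
echo '#include <hwloc.h>' | g++ -x c++ -E - > /dev/null && echo hwloc: ok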
Cheers,
E.
On Tue, Feb 07, 2023 at 04:07:16PM +0100, Barbulescu Razvan wrote:
> Hello,
>
> I want to use cado on a small cluster called plafrim (in Bordeaux). In order
> to use binaries like cmake, gcc and gmp, I have to load modules like
> "module load build/cmake/3.15.3".
>
> Do I have to load a specific module for g++? I don't see any such module on
> my cluster. Or maybe it is included in gcc.
>
> How do I know that I have all the prerequisites?
>
> Please find the output of make in "cado-bug" and the modules list in
> "modules".
>
> cheers,
>
> Razvan
> [rbarbule@zonda01 cado-nfs]$ make
> [ 0%] Building gf2x
> Making all in lowlevel
> Making all in src
> Making all in .
> Making all in fft
> Making all in tests
> [ 0%] Built target gf2x-build
> [ 0%] Built target antebuffer
> [ 0%] Generating list of modified files in working tree
> [ 0%] Built target list_modified_files
> [ 7%] Built target utils
> [ 8%] Built target utils_with_io
> [ 10%] Built target numbertheory_tool
> [ 10%] Built target polyselect_common
> [ 10%] Built target dlpolyselect
> [ 11%] Built target polyselect_middle
> [ 11%] Built target polyselect
> [ 11%] Built target rotate_all
> [ 13%] Built target sopt
> [ 14%] Built target polyselect_ropt
> [ 15%] Built target polyselect_gfpn
> [ 15%] Built target polyselect3
> [ 15%] Built target rotate
> [ 15%] Building CXX object
> sieve/CMakeFiles/las_core_b0.dir/las-siever-config.cpp.o
> In file included from
> /home/rbarbule/cado-nfs/sieve/las-siever-config.cpp:11:
> In file included from
> /home/rbarbule/cado-nfs/sieve/./las-siever-config.hpp:8:
> In file included from /home/rbarbule/cado-nfs/sieve/./fb.hpp:29:
> /home/rbarbule/cado-nfs/utils/mmappable_vector.hpp:20:50: error: no
> template named 'is_trivially_constructible' in namespace 'std'; did you
> mean 'is_trivially_destructible'?
> struct works_with_mmappable_vector : public
> std::is_trivially_constructible<T>
> ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~
> is_trivially_destructible
> /usr/lib/gcc/x86_64-redhat-linux/4.8.5/../../../../include/c++/4.8.5/type_traits:1204:12:
> note: 'is_trivially_destructible' declared here
> struct is_trivially_destructible
> ^
> 1 error generated.
> make[2]: *** [sieve/CMakeFiles/las_core_b0.dir/las-siever-config.cpp.o]
> Error 1
> make[1]: *** [sieve/CMakeFiles/las_core_b0.dir/all] Error 2
> make: *** [all] Error 2
> make: *** [all] Error 1
> build/ac269-am114-lt246-m41417/ac269-am114-lt246-m41417
> build/bison/3.3
> build/cmake/3.21.3
> build/conda/4.08.2
> build/conda/4.08.3
> build/conda/4.10
> build/help2man/1.47.13
> build/imake/1.0.8
> build/makedepend/1.0.6
> build/ninja/1.11
> compiler/aocc/2.1.0
> compiler/cuda/10.0
> compiler/cuda/10.2
> compiler/cuda/11.2
> compiler/cuda/11.3
> compiler/cuda/11.4
> compiler/cuda/11.6
> compiler/gcc/10.1.0
> compiler/gcc/10.2.0
> compiler/gcc/11.2.0
> compiler/gcc/9.3.0
> compiler/intel/2019_update4
> compiler/intel/2020_update4
> dnn/cudnn/10.0-v7.5.0
> dnn/cudnn/10.0-v7.6.4.38
> dnn/cudnn/11.2-v8.1.1.33
> dnn/cudnn/9.0-v7.0
> dnn/cudnn/9.0-v7.1
> dnn/tensorRT/7.0.0.11
> editor/jupyterlab/3.1.0rc1
> editor/nano/4.6
> editor/vim/8.2.0
> formal/pari/openmpi/2.13.0
> formal/pari/openmpi/2.13.3
> formal/pari/openmpi/2.13.4
> formal/pari/openmpi/2.15.1
> formal/pari/pthread/2.13.0
> formal/pari/pthread/2.13.3
> formal/pari/pthread/2.13.4
> formal/pari/pthread/2.15.1
> formal/sage/7.0
> formal/sage/8.9
> hardware/hwloc/1.11.13
> hardware/hwloc/2.6.0
> hardware/hwloc/2.7.0
> hardware/hwloc/2.9.0
> hardware/libpciaccess/0.16
> io/hdf5/nompi/1.10.5
> io/hdf5/openmpi@4.0.2/1.10.5
> language/java/jdk-17.0.2
> language/java/jre1.8.0_321
> language/julia/0.6.0
> language/julia/1.0.0
> language/julia/1.2.0
> language/julia/1.6.1
> language/julia/1.7.2
> language/python/3.5.9
> language/python/3.8.0
> language/python/3.8.0_shared
> language/python/3.9
> language/python/intel/3.6
> language/python/intel-3.6.0
> language/python-keras/3.6.9
> linalg/amd-blis-aocc
> linalg/amd-blis-gcc
> linalg/amd-libflame-aocc
> linalg/amd-libflame-gcc
> linalg/blis/int32-openmp/2.1/blis
> linalg/blis/int64-openmp/2.1/blis
> linalg/chameleon-hmat/mpi
> linalg/chameleon-hmat/mpi-fxt
> linalg/cutlass/2.10
> linalg/eigen/3.3.9
> linalg/lapacke/3.9.0
> linalg/lapacke/3.9.1
> linalg/lapack_reference/3.9.0
> linalg/mkl/2019_update4
> linalg/mkl/2020_update4
> linalg/mumps/5.2.1
> linalg/mumps/int32/5.4.1
> linalg/openblas/0.3.9
> linalg/petsc/3.12.2
> linalg/plasma/2.8.0
> linalg/scalapack/2.0.2
> mesh/medit/2.3
> mesh/mmg/5.0.0
> mesh/tetgen/1.6.0
> mesh/triangle/1.6
> mpi/intel/2019.4.243
> mpi/openmpi/2.0.4
> mpi/openmpi/3.1.4
> mpi/openmpi/4.0.1
> mpi/openmpi/4.0.1-intel
> mpi/openmpi/4.0.2
> mpi/openmpi/4.0.2-testing
> mpi/openmpi/4.0.3
> mpi/openmpi/4.0.3-mlx
> mpi/openmpi/4.1.1
> partitioning/bitpit/1.7.1
> partitioning/metis/int32/5.1.0
> partitioning/metis/int64/5.1.0
> partitioning/pampa/2.0
> partitioning/parmetis/4.0.3-nvhpc_223
> partitioning/scotch/int32/5.1.12
> partitioning/scotch/int32/6.0.9
> partitioning/scotch/int32/6.1.0
> partitioning/scotch/int32/6.1.1
> partitioning/scotch/int32/7.0.1
> partitioning/scotch/int32-seq/7.0.1
> partitioning/scotch/int64/5.1.12
> partitioning/scotch/int64/6.0.9
> partitioning/scotch/int64/6.1.0
> partitioning/scotch/int64/6.1.1
> partitioning/scotch/int64/7.0.1
> partitioning/scotch/int64-seq/7.0.1
> partitioning/scotch_with_esmumps/6.0.7
> perftools/simgrid/3.24
> perftools/simgrid/3.28
> perftools/simgrid/3.30
> physics/aerosol-deps
> physics/geosx_deps/mkl2020_gcc9.3.0_mpi4.0.3-Release-160522
> physics/geosx_deps/mkl2020_gcc9.3.0_mpi4.0.3-Release-f33507c31d
> physics/geosx_deps/new_module_nocuda
> physics/pygeosx/mkl2020_gcc9.3.0_mpi4.0.3
> runtime/parsec/master/mpi
> runtime/parsec/master/mpi-cuda
> runtime/parsec/master/shm
> runtime/parsec/master/shm-cuda
> runtime/parsec/master/shm-trace
> runtime/quark/0.9.1
> runtime/starpu/1.3.2/mpi
> runtime/starpu/1.3.2/mpi-fxt
> runtime/starpu/1.3.3/mpi
> runtime/starpu/1.3.3/mpi-cuda
> runtime/starpu/1.3.3/mpi-cuda-fxt
> runtime/starpu/1.3.3/mpi-fxt
> runtime/starpu/1.3.4/mpi
> runtime/starpu/1.3.4/mpi-cuda
> runtime/starpu/1.3.4/mpi-cuda-fxt
> runtime/starpu/1.3.4/mpi-fxt
> runtime/starpu/1.3.7/mpi
> runtime/starpu/1.3.7/mpi-cuda
> runtime/starpu/1.3.7/mpi-cuda-fxt
> runtime/starpu/1.3.7/mpi-fxt
> runtime/starpu/1.3.8/mpi
> runtime/starpu/1.3.8/mpi-cuda
> runtime/starpu/1.3.8/mpi-cuda-fxt
> runtime/starpu/1.3.8/mpi-fxt
> runtime/starpu/1.3.9/gcc@11.2.0-hwloc@2.7.0-openmpi@4.0.3-cuda@11.6-fxt@0.3.14
> runtime/starpu/master/gcc@11.2.0-hwloc@2.7.0-openmpi@4.0.3-cuda@11.6-fxt@0.3.14
> toolchains/llvm/13.0.1
> toolchains/llvm/14.0.6
> toolchains/llvm/15.0.1
> tools/argo/v3.3.1
> tools/bapcodframework/0test-0.5.5f
> tools/bapcodframework/2.0.0
> tools/bapcodframework/last
> tools/bapcodframework/last-master
> tools/bapcodframework/last-Niteroi
> tools/bapcodframework/last-Rcsp
> tools/bapcodframework/last-Z-master-fusion
> tools/boost/1.56.0-gcc-9.2.0
> tools/boost/1.71.0
> tools/boost/1.76.0-gcc-9.2.0
> tools/coin-or/2021.07.13
> tools/cplex/12.10.0
> tools/cplex/12.8.0
> tools/cplex/12.9.0
> tools/cplex/20.1.0
> tools/cromwell/77
> tools/ctags/p5.9.20220306.0
> tools/ddt/18.3
> tools/ddt/19.1.4
> tools/ddt/20.0.3
> tools/doxygen/1.9.5
> tools/fmt/9.1.1-gcc@12.2
> tools/gdal/3.3.3
> tools/git/2.30.0
> tools/git/2.36.0
> tools/gitlab-ci
> tools/gitlab-runner/1.11.2
> tools/gitlab-runner/12.0.2
> tools/gitlab-runner/14.7.0
> tools/gitlab-runner/9.0.2
> tools/git-lfs/2.9.0
> tools/gurobi/8.1.1
> tools/gurobi/9.1.1
> tools/gurobi/9.1.2
> tools/gurobi/9.5.0
> tools/irods/3.3.1
> tools/lemon/1.3.1
> tools/matlab/MCR_R2019a
> tools/matlab/MCR_R2019b
> tools/matlab/R2019
> tools/matlab/R2019a
> tools/module_cat/1.0.0
> tools/monolix/2019R2
> tools/monolix/2020R1
> tools/monolix/2021R1
> tools/monolix/2021R2
> tools/neos/1.0.0
> tools/nvhpc/2021_217
> tools/nvhpc/2022_223
> tools/oc/4.10.0-0.okd-2022-03-07-131213
> tools/pachctl/v2.1.4
> tools/parcoach/2.0.0
> tools/proj/8.1.0
> tools/protobuf/2.5.0
> tools/R/3.6.2
> tools/R/4.2.1
> tools/recutils/1.8
> tools/sasl/2.1.27
> tools/socat/2.0.0-b8
> tools/sqlite3/3.36.0
> tools/sync_clocks/mpi_sync_clocks
> tools/tiff/4.3.0
> tools/toil/5.3.0
> tools/toil/5.4.0
> tools/trace/likwid-4.0.3
> tools/trace/likwid/4.0.3
> tools/trace/likwid/4.0.3-amd
> tools/trace/papi/5.0.1
> tools/trace/papi/5.0.1-amd
> tools/trace/uprof
> tools/tree/1.8.0
> tools/udocker/1.2.7
> trace/energy_scope/1.0
> trace/energy_scope/1.1
> trace/energy_scope/1.2
> trace/energy_scope/1.4
> trace/energy_scope/1.5
> trace/energy_scope/1.7
> trace/eztrace/1.1-8
> trace/eztrace/1.1-9
> trace/eztrace/svn
> trace/fxt/0.3.11
> trace/fxt/0.3.12
> trace/fxt/0.3.13
> trace/fxt/0.3.14
> trace/fxt/0.3.9
> visu/feh/3.7.2
> visu/gnuplot/5.5
> visu/srun/0.0.1
>
--
For an independent, transparent and rigorous evaluation!
I am on Inria's Evaluation Committee (CE) to make my contribution there.