gfortran + mpich-3.0.4

General scientific issues regarding ROMS

Moderators: arango, robertson

nana

gfortran + mpich-3.0.4

#1 Post by nana

In the build.bash script there is a part in the gfortran case that I am going to use, but since I use mpich-3.0.4 and not mpich2 or openmpi, do I have to add a part about mpich to the gfortran section?


Code: Select all

gfortran )
      export             ESMF_OS=Linux
      export       ESMF_COMPILER=gfortran
      export           ESMF_BOPT=O
      export            ESMF_ABI=64
      export           ESMF_COMM=mpich
      export           ESMF_SITE=default

      export       ARPACK_LIBDIR=/opt/gfortransoft/serial/ARPACK
      if [ -n "${USE_MPI:+1}" ]; then
        if [ "${which_MPI}" = "mpich2" ]; then
          export        ESMF_DIR=/opt/gfortransoft/mpich2/esmf
          export      MCT_INCDIR=/opt/gfortransoft/mpich2/mct/include
          export      MCT_LIBDIR=/opt/gfortransoft/mpich2/mct/lib
          export  PARPACK_LIBDIR=/opt/gfortransoft/mpich2/PARPACK
        elif [ "${which_MPI}" = "openmpi" ]; then
          export        ESMF_DIR=/opt/gfortransoft/openmpi/esmf
          export      MCT_INCDIR=/opt/gfortransoft/openmpi/mct/include
          export      MCT_LIBDIR=/opt/gfortransoft/openmpi/mct/lib
          export  PARPACK_LIBDIR=/opt/gfortransoft/openmpi/PARPACK
        fi
      fi
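For reference, if a mpich branch is needed here, it could follow the same pattern as the existing cases. This is only a sketch: the /opt/gfortransoft/mpich/... paths are assumptions that mirror the mpich2 and openmpi entries, and they must point to wherever ESMF, MCT, and PARPACK were actually built against mpich-3.0.4.

Code: Select all

        elif [ "${which_MPI}" = "mpich" ]; then
          # Assumed install prefix; adjust to the real mpich-3.0.4 build locations
          export        ESMF_DIR=/opt/gfortransoft/mpich/esmf
          export      MCT_INCDIR=/opt/gfortransoft/mpich/mct/include
          export      MCT_LIBDIR=/opt/gfortransoft/mpich/mct/lib
          export  PARPACK_LIBDIR=/opt/gfortransoft/mpich/PARPACK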

kate

Re: gfortran + mpich-3.0.4

#2 Post by kate

If you are only using the ocean model, you shouldn't need any of those libraries, so no, you shouldn't need to do anything. Does its compile script get called "mpif90"?
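A quick way to check, assuming the mpich-3.0.4 bin directory is on the PATH:

Code: Select all

which mpif90        # should point into the mpich-3.0.4 installation
mpif90 -show        # MPICH wrappers print the underlying compile command; it should invoke gfortran
mpiexec --version   # should report the matching MPICH (Hydra) build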

nana

Re: gfortran + mpich-3.0.4

#3 Post by nana

Hi dear Kate,
Thank you for your reply, and I am sorry that I didn't check here for a while.
This is what I enabled in coawst.bash:

Code: Select all

export           USE_MPI=on             # distributed-memory parallelism
export        USE_MPIF90=on             # compile with mpif90 script
export         which_MPI=mpich          # compile with MPICH library
#export        which_MPI=mpich2         # compile with MPICH2 library
#export        which_MPI=openmpi        # compile with OpenMPI library
I want to run 3 models (ocean + wave + atm), so I need MPICH. The compilation ends successfully and gives me coawstM, and then I run it with this command:

Code: Select all

mpiexec –np 2 ./coawstM ./Projects/g/ swan_only.in
I get this error, and I don't know what it is. It is my first time running an MPI command on my machine, and I don't know if it is because I use a virtual machine or something else. Please help me solve it.
Regards
[proxy:0:0@nazanin-VirtualBox] HYDU_create_process (./utils/launch/launch.c:75): execvp error on file –np (No such file or directory)

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= EXIT CODE: 255
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================

kate

Re: gfortran + mpich-3.0.4

#4 Post by kate

You might need some of the MCT flags set in your case. But you say it successfully built coawstM, so no worries.
mpiexec –np 2 ./coawstM ./Projects/g/ swan_only.in
It's complaining about the '-np' argument. Reading the man page for mpiexec, it should instead be '-n numproc'. You should not have a space here: '/g/ swan_only.in' if it is all part of the path to the input file.
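For example, assuming the input file really lives at ./Projects/g/swan_only.in, the corrected command would look something like:

Code: Select all

mpiexec -n 2 ./coawstM ./Projects/g/swan_only.in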

nana

Re: gfortran + mpich-3.0.4

#5 Post by nana

Hi dear Kate, and thank you for your reply.
Now I run this command:

Code: Select all

mpirun -np 2 ./coawstM   swan_only.in
and it gives me this error, and I don't know whether it is because of a malfunction in my coawstM, some issue in MPICH or MCT, a hardware issue, or something else:
nazanin@nazanin-VirtualBox:~/coawst/SOURCE/coawst/Projects/g$ mpirun -np 2 ./coawstM swan_only.in
Fatal error in PMPI_Comm_rank: Invalid communicator, error stack:
PMPI_Comm_rank(108): MPI_Comm_rank(comm=0x0, rank=0x7fff24a405bc) failed
PMPI_Comm_rank(66).: Invalid communicator
Fatal error in PMPI_Comm_rank: Invalid communicator, error stack:
PMPI_Comm_rank(108): MPI_Comm_rank(comm=0x0, rank=0x7fff63e1598c) failed
PMPI_Comm_rank(66).: Invalid communicator

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= EXIT CODE: 1
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
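A fatal "Invalid communicator" right at MPI_Comm_rank often indicates that the executable was built against one MPI installation but launched with a different one. A few hedged checks, assuming MPICH is on the PATH:

Code: Select all

which mpif90 mpirun           # both should come from the same mpich-3.0.4 installation
mpif90 -show                  # shows the gfortran command and MPI include/library paths used at build time
ldd ./coawstM | grep -i mpi   # the MPI library resolved at run time should be that same MPICH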
