[molpro-user] Molpro 2006.1 runtime error

Jyh-Shyong Ho c00jsh00 at nchc.org.tw
Thu Jul 19 03:10:58 BST 2007


Dear Molpro users,

Recently I tried to install Molpro 2006.1 on our new IBM cluster.
After some effort I managed to compile the program, but a runtime
error occurs, so I am posting here in the hope that someone has
already solved this problem.

Each node of the new cluster has 2 Intel Xeon 5160 CPUs, 16GB RAM
and a Voltaire InfiniBand port; the Intel Fortran compiler 9.1.049
and the Intel CMKL 9.0 library were used.

Here is the setup file I used to build the GA library:

export TARGET=LINUX64
export MPI_INCLUDE=/opt/vltmpi/OPENIB/mpi.icc.rsh/include
export MPI_LIB=/opt/vltmpi/OPENIB/mpi.icc.rsh/lib
export USE_MPI=yes
export BLAS_I8=yes
export BLAS_LIB="-L/opt/intel/cmkl/lib/em64t -lmkl_em64t -lguide -lpthread -ldl -lsysfs"
export LIBMPI="-L/opt/vltmpi/OPENIB/mpi.icc.rsh/lib -lmpich -lfmpich -L/usr/local/ofed/lib64 -libverbs"
export LARGE_FILES=TRUE
export ARMCI_NETWORK=OPENIB
export IB_INCLUDE=/usr/local/ofed/include
export IB_LIB=/usr/local/ofed/lib64
export PATH=${PATH}:/opt/vltmpi/OPENIB/mpi.icc.rsh/bin
export GA_C_CORE=yes
gmake TARGET=LINUX64 FC="ifort -no-ipo" CC="icc -no-ipo" CXX=icpc
---------------------------------------------end of file
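
As a quick sanity check of the GA build itself, it may be worth running GA's
own parallel test before linking Molpro. A minimal sketch, assuming the usual
GA 4.x source layout under the gaib directory, the test.x make target in
global/testing, and an illustrative hosts file in the current directory:

cd /package/chem/workdir/molpro2k6.1.p/gaib/global/testing
gmake test.x
/opt/vltmpi/OPENIB/mpi.icc.rsh/bin/mpirun_ssh -hostfile hosts -np 4 ./test.x

If test.x already fails with a similar nga_create/ga_set_data error, the
problem lies in the GA build rather than in Molpro.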


Here is the CONFIG file I used to build Molpro 2006.1:

# MOLPRO CONFIG generated at Wed Jul 18 21:29:39 CST 2007 with version 2006.1
#               for host irish2, architecture unix unix-i8 unix-linux unix-linux-x86_64
#
#  insert additional hosts before irish2 in above line, if desired
#
ARCHITECTURE="unix unix-i8 unix-linux unix-linux-x86_64 mpp"



# Compilers ..
CC="gcc -Difc -DI64 -m64"
FC="/opt/intel/fce/9.1.045/bin/ifort -pc 64 -auto -autodouble -Vaxlib -i8"
F90="/opt/intel/fce/9.1.045/bin/ifort -pc 64 -auto -autodouble -Vaxlib -i8"
# compiler command to be used only when linking molpro.exe .. eg mpxlf on IBM SP
LINKFC="/opt/intel/fce/9.1.045/bin/ifort -pc 64 -auto -autodouble -Vaxlib -i8"
YACC="bison -b y"
ARFLAGS="-r"
PERL="1"
PERLEXE="/usr/bin/perl"
PERLVERSION="5.008003"
# C defines
# nb    SEEK is disk seek speed in seconds
#       SPEED is disk transfer rate in 8 bytes/second
# compiler optimisation
COPT="-O3"
FOPT="-w -O3"
# compiler explicit no optimisation
CDEF="-DSEEK=.001 -DSPEED=200000   -DLARGEFILES -DUSE_MPI=mpich
-DMPIGM_FORMAT=1 -DZLIB -DHAVE_INTTYPES_H   -DMA_ALLOC -DMOLPROC_PAR"
CNOPT=" "
COPT1="-O1"COPT2="-O2"
FNOPT="-O0"
FOPT1="-w -O1"
FOPT2="-w -O2"
FOPT3="-w -O3"
# compiler debug flag
CDEBUG="-g"
FDEBUG="-g"
# static variables
FSTATIC=" "
# 64-bit integers
FI64=" "
# profiling
FPROFILE="-p"
CPROFILE="-p"
# additional pre-processor flags
FTCFLAGS="mpp eaf blas1 blas2 blas3 lapack"
# additional directories to be compiled in mpp case
MPPDIR="mpp"
SLATERDIR=""
# additional libraries and link options
LIBS="  -lz"
LINKOPT="-Wl,-rpath,/opt/intel/fce/9.1.045/lib -openmp"
# non-standard location of system libraries
LIBDIR="/package/chem/workdir/molpro2k6.1.p/gaib/lib/LINUX64"
# GKS X-windows library
GKSLIB="-lgks0"
# BLAS library
BLASLIB="-L/opt/intel/cmkl/lib/em64t -lmkl_em64t
-Wl,-rpath,/opt/intel/cmkl/lib/em64t"
# LAPACK libraryLAPACKLIB="-L/opt/intel/cmkl/lib/em64t -lmkl_lapack
-Wl,-rpath,/opt/intel/cmkl/lib/em64t"
# MPP?
MPP="1"
MPPX="0"
SLATER=0
WRAPPER="/opt/vltmpi/OPENIB/mpi.icc.rsh/bin/mpirun_ssh "
USE_MPI="1"
USE_LAPI=""
USE_MYRINET=""
MPI_LIB="-L/opt/vltmpi/OPENIB/mpi.icc.rsh/lib -lmpich"
MPPNAME="mpi"
ARCHNAME="x86_64"
MPITYPE="mpich"
MPIBASEDIR="/opt/vltmpi/OPENIB/mpi.icc.rsh"
MYRINET_LIB=""
MYRINET_LINKPARS=""
MPIGM_FORMAT="1"
USE_GIO="0"
USE_GATOOLS="0"
# files in src/ to be compiled without optimization
F77VERSION="Intel(R) Fortran Compiler for Intel(R) EM64T-based applications,
Version 9.1    Build 20070320 Package ID: l_fc_c_9.1.045"
FORTVERSION="ifort9.1"
NO_OPT="blas/lapack0.f"
F_OPT1=" "
F_OPT2=" "
F_OPT3=" "C_OPT1=" "
C_OPT2=" "
# ranlib command (if not needed, true)
RANLIB="true"
# ls command to get user and group
LSFULL="ls -l"
# cp -p if it exists, else cp
CP="cp -p"
# tar command
TAR="tar"
# awk command
AWK="awk"
# strip command
STRIP="strip"
# installation directories
INSTBIN="/package/chem/molpro/binp"
INSTLIB="/package/chem/molpro/libp"
INSTHTML="/u1/chem/public_html/molpro/molpro2006.1"
INSTCGI="/u1/chem/public_html/molpro/molpro2006.1"
# makedepend configuration
MAKEDEPEND_OPTIONS=""
MODULE_SUFFIX="mod"
MODULE_FLAG="-I"
# non-default libraries
LIBDIR_mpi=/package/chem/workdir/molpro2k6.1.p/gaib/lib/LINUX64
BLASLIB_x86_64="-L/opt/intel/cmkl/lib/em64t -lmkl_em64t -Wl,-rpath,/opt/intel/cmkl/lib/em64t"
LAPACKLIB_x86_64="-L/opt/intel/cmkl/lib/em64t -lmkl_lapack -Wl,-rpath,/opt/intel/cmkl/lib/em64t"
WRAPPER_mpi=/opt/vltmpi/OPENIB/mpi.icc.rsh/bin/mpirun_ssh
PARSE=parse-x86_64-unknown-linux-gnu-i8.o.gz
 
--------------------------------------------------------------------end of file
 
To link the libraries properly, I modified the file bin/linkprog:

...
LIBS="-L$lib -l$progname_base $galib $GKSLIB $LAPACKLIB $BLASLIB  -lpthread
-ldl -lsysfs  \
      -L/opt/vltmpi/OPENIB/mpi.icc.rsh/lib -lmpich -L/usr/local/ofed/lib64
-libverbs"
... 
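
To check which of these libraries actually ended up in the final executable,
one can inspect the binary with ldd (this only lists dynamically linked
libraries, so anything linked statically, e.g. a static libmpich.a, will not
show up):

ldd /package/chem/workdir/molpro2k6.1.p/bin/molprop_2006_1_i8_x86_64_mpi.exe | egrep 'mpich|ibverbs|mkl'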

Our cluster uses the IBM LoadLeveler queueing system. In addition, the
Voltaire vltmpi command for running a parallel job is

mpirun_ssh -hostfile file_name_of_assigned_host -np no.of.cores executable inputfile
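
For instance, with the hosts file written by the job script below and a
hypothetical executable and input:

mpirun_ssh -hostfile /package/chem/molpro/examples/hosts -np 8 ./myprog myinput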

I therefore modified the file utilities/molpro.c, replacing -machinefile with
-hostfile and mpirun-machinefile with mpirun-hostfile:

diff molpro.c molpro.c.orig

39c39
< #define MACHINEFILEOPTION "-hostfile"
---
> #define MACHINEFILEOPTION "-machinefile"
930c930
<         {"mpirun-hostfile",1,0,0},
---
>         {"mpirun-machinefile",1,0,0},

I then use the following LoadLeveler job script to submit the Molpro job:

#!/bin/sh -f
# @ job_type = parallel
# @ output = h2o_xyzinput.out
# @ error  = h2o_xyzinput.err
# @ node = 2
# @ tasks_per_node = 4
# @ node_usage = not_shared
# @ class = 8cpu
# @ queue

mkdir -p /scratch/chem

echo $LOADL_PROCESSOR_LIST|tr " " "\n"> /package/chem/molpro/examples/hosts
export HOSTFILE=/package/chem/molpro/examples/hosts
export PBS_NODEFILE=/package/chem/molpro/examples/hosts
source /package/chem/molpro/setmolpro
export SCRATCH=/scratch/chem

echo 'The Molpro2006.1 job begins at:' `date`

/package/chem/molpro/binp/molpro -n8 -d/scratch/chem h2o_xyzinput.com

wait
echo 'The Molpro2006.1 job completes at:' `date`
#rm -rf /scratch/chem

exit
-------------------------------------end of file
The variable LOADL_PROCESSOR_LIST returns the assigned node names on a single
line, separated by blank spaces. The generated hosts file contains the
assigned node names, one per line, and the variable PBS_NODEFILE points to
this host file. LoadLeveler on our IBM cluster has no LOADL_HOSTFILE variable,
so I have to use LOADL_PROCESSOR_LIST and define PBS_NODEFILE so that
molpro.c can work properly.
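
A minimal illustration of the conversion done by the tr line in the script,
using hypothetical node names:

$ echo "iris007 iris007 iris008 iris008" | tr " " "\n"
iris007
iris007
iris008
iris008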

Everything looked fine: the compilation completed without error, and with the
job script a Molpro test job could be submitted successfully.

Here comes the runtime error:


File JOB.out

ARMCI configured for 2 cluster nodes. Network protocol is 'OpenIB verbs'.

 MPP nodes  nproc
 iris007      4
 iris008      4
 ga_uses_ma=false, calling ma_init with nominal heap.
 GA-space will be limited to   0.0 words (determined by -G option)
7:7:nga_create_config:ga_set_data:ttype not yet supported :: 1010
5:5:nga_create_config:ga_set_data:ttype not yet supported :: 1010
6:6:nga_create_config:ga_set_data:ttype not yet supported :: 1010
4:4:nga_create_config:ga_set_data:ttype not yet supported :: 1010

---------------------------------------------------------end of file
Here is the file molpro.rc:

-G 8000000 # MA memory
-d/scratch/$USER    # directory in which the program should be run
-I/scratch/$USER    # directory to store permanent copies of int files (1)
-W$HOME/wfu    # directory to store permanent copies of wf files (2,3)
-L/package/chem/workdir/molpro2k6.1.p/lib/    # directory containing library files
-x/package/chem/workdir/molpro2k6.1.p/bin/molprop_2006_1_i8_x86_64_mpi.exe    # executable
-m7272727    # default memory in 8-byte words
-K524288    # mxma cache size in 8-byte words
-B64    # mxma blocking size
--mpi # determines procgroup file format
-l/opt/vltmpi/OPENIB/mpi.icc.rsh/bin/mpirun_ssh # program run wrapper
--environment MOLPRO_COMPILER=ifort9.1
--environment MOLPRO_BLASLIB=mkl_em64t
---------------------------------------------------------end of file

It seems that my GA library does not behave correctly.

Thanks for any suggestions on how to solve this problem.


Jyh-Shyong Ho, Ph.D.
Research Scientist
National Center for High Performance Computing
Hsinchu, Taiwan, ROC
--
National Center for High-performance Computing (http://www.nchc.org.tw/)



