[molpro-user] weird issue with MOLPRO 2010.1 binaries

Gershom (Jan M.L.) Martin gershom at weizmann.ac.il
Wed Sep 29 16:41:58 BST 2010


I eventually got everything to work as well, except for the "factory"
binaries. The grid engine on our cluster is MOAB (commercial version of
Torque+MAUI), but SGE should work as well. Below are my installation notes.
Obviously, directory locations etc. are specific to our HPC cluster.
Throughout, the compilers and math library used are Intel 11.1 and MKL 10,
respectively.

Jan

*To use vanilla TCGMSG:*
(1) download GA 4.3.2, unpack it inside the MOLPRO directory, and build it
with (on one line)

make TARGET=LINUX64 FC=ifort CC=icc

IMPORTANT: using TARGET=LINUX will generate executables that seem to work OK
for CCSD(T) but produce garbage for MRCI
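
For reference, step (1) as concrete shell commands (the tarball location is
illustrative; the sandbox path matches the configure line in step (2) below):

# unpack GA 4.3.2 inside the MOLPRO source tree, then build it
cd /home/comartin/sandbox/molpro2010.1-ga-tcgmsg
tar xzf /path/to/ga-4-3-2.tgz      # creates the ga-4-3-2/ subdirectory
cd ga-4-3-2
make TARGET=LINUX64 FC=ifort CC=icc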

(2) configure and build MOLPRO (the configure options go on one line):

./configure -icc -ifort -blaspath /usr/local/intel/mkl/10.2.2.025/lib/em64t -mpp -mppbase /home/comartin/sandbox/molpro2010.1-ga-tcgmsg/ga-4-3-2 -instroot /home/comartin/apps/molpro2010.1-ifort-11.059-mkl-tcgmsg

make -j16
(3) the resulting executables work fine interactively but not under PBS,
owing to the dual network (Gbit Ethernet and InfiniBand). The following will
make it work: edit bin/molpro and look for these two lines

  PROCGRP="$hostfile"
fi

then append the following block after them

#
# JMdebug: dirty hack for the two local networks; strip the "-ib0"
# suffix so the hostfile carries the Gbit names
echo JMdebug dirty hack for local two networks
sed 's/-ib0//g' <$hostfile >/tmp/editedhostfile.$$
cat /tmp/editedhostfile.$$ >$hostfile
echo hostfile is
cat $hostfile
echo JMdebug end hostfile
#

it should now work in batch as well. Note that communication via sockets may
be suboptimal.
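
To see what the hack does, here is the sed line applied to a sample PBS
hostfile (node names are illustrative): PBS hands out the InfiniBand-suffixed
hostnames, and stripping "-ib0" leaves the plain Gbit names that TCGMSG's
rsh/ssh startup can actually reach.

% cat $hostfile
node001-ib0
node001-ib0
node002-ib0
% sed 's/-ib0//g' <$hostfile
node001
node001
node002
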
*To use TCGMSG with IMPI (the Intel MPI shipping with the compilers):*

(1) download GA 4.3.2, unpack it inside the MOLPRO directory, and build it
with (on one line)

make TARGET=LINUX64 FC=ifort CC=icc MPI_LIB=/usr/local/intel/impi/3.2.2.006/lib64 MPI_INCLUDE=/usr/local/intel/impi/3.2.2.006/include64 USE_MPI=yes

IMPORTANT: as above, using TARGET=LINUX will generate executables that seem
to work OK for CCSD(T) but produce garbage for MRCI

(2) then (assuming you are in /home/comartin/sandbox/molpro2010.1-try3)

set path=(/usr/local/intel/impi/3.2.2.006/bin64 $path)

./configure -icc -ifort -blaspath /usr/local/intel/mkl/10.2.2.025/lib/em64t -mpp -mppbase /home/comartin/sandbox/molpro2010.1-try3/ga-4-3-2 -instroot /home/comartin/apps/molpro2010.1-ifort-11.059-mkl-tcgmsg

make -j16

(3) the following script should work for a test job

#!/bin/tcsh
cd /home/comartin/sandbox/molpro2010.1-try3
echo RUNNING ON HOST $HOST
#setenv TCGRSH /usr/bin/ssh
set path=(/usr/local/intel/impi/3.2.2.006/bin64 $path)
setenv LD_LIBRARY_PATH /usr/local/intel/impi/3.2.2.006/lib64:$LD_LIBRARY_PATH
# start the MPD process manager that Intel MPI 3.x requires
/usr/local/intel/impi/3.2.2.006/bin64/mpd &
bin/molpro -v -n 4 -m 64M -d /scratch/$USER n2test.com
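
n2test.com can be any small input; a minimal N2 job along the following
lines (geometry and basis are illustrative) exercises both the SCF and the
CCSD(T) code:

cat > n2test.com <<EOF
***,N2 test
r=1.098 ang            ! N-N distance
geometry={N1;N2,N1,r}
basis=vdz
hf
ccsd(t)
EOF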


*To use TCGMSG with OpenMPI:*

for ga-4-3-2:

make TARGET=LINUX64 FC=mpif77 CC=mpicc MPI_LIB=/usr/local/openmpi/lib
MPI_BIN=/usr/local/openmpi/bin USE_MPI=yes

for molpro:

./configure -icc -ifort -blaspath /usr/local/intel/mkl/10.2.2.025/lib/em64t -mpp -mppbase /home/comartin/sandbox/molpro2010.1-ga-openmpi/ga-4-3-2 -instroot /home/comartin/apps/molpro2010.1-ifort-11.059-mkl-ga-openmpi
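
A hedged run sketch for this build (tcsh, as in the test script above; the
bin/molpro wrapper is expected to take care of invoking mpirun itself, so
only paths and process counts should need adjusting):

set path=(/usr/local/openmpi/bin $path)
setenv LD_LIBRARY_PATH /usr/local/openmpi/lib:$LD_LIBRARY_PATH
/home/comartin/apps/molpro2010.1-ifort-11.059-mkl-ga-openmpi/bin/molpro -v -n 4 -m 64M -d /scratch/$USER n2test.com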

in order to run on some systems you may need to adjust a system parameter
(the InfiniBand locked-memory limit):

http://www.open-mpi.org/faq/?category=openfabrics#ib-locked-pages
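
That FAQ entry amounts to raising the locked-memory limit on every compute
node, for example by appending the lines below to /etc/security/limits.conf
and restarting the pbs_mom/sge_execd daemons so batch jobs inherit the new
limit (a sketch; sites differ on where they prefer to set this):

*  soft  memlock  unlimited
*  hard  memlock  unlimited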

On Tue, Sep 28, 2010 at 1:25 PM, Reuti <reuti at staff.uni-marburg.de> wrote:

> Hi,
>
> On 27.09.2010 at 22:12, Tanja van Mourik wrote:
>
> > Hi Jan and Manhui,
> >
> >>> I agree it's strange that it works interactively but not through PBS.
> >>> Perhaps to put this rsh vs. ssh stuff to rest you could try the
> >>> following:
> >>>
> >>> 1)  put the following in your .cshrc file so that it is set on all your
> >>> nodes:   setenv RSHCOMMAND /usr/bin/ssh
> >>>
> >>> 2)  if that doesn't help, just make a soft link (on all nodes) that
> >>> points /usr/bin/rsh to /usr/bin/ssh
> >>>
> >>> Manhui:  why is rsh the default and not ssh for the parallel binaries?
> >>> No one uses rsh any more.....
> >> The current parallel molpro binaries are still built using GA with
> >> TCGMSG, which is very portable, and the default connection in TCGMSG is
> >> rsh. We are considering using GA with TCGMSG-MPI in future versions.
> >>
> >> After discussion with Jan, he has found that the problem lies in the HPC
> >> cluster having two kinds of networks, and he is looking for a solution.
> >
> > Has a solution been found for this problem? I encounter something very
> > similar with SGE. We compiled molpro2009 and now installed the binary of
> > molpro2010, but both versions give the following error message when
> > trying to run via SGE:
>
> I run both self-compiled versions w/o problems inside SGE. But in all cases
> I used Open MPI (--with-sge) as transport (either native or for GA's
> TCGMSG). I used the Portland compilers though.
>
> -- Reuti
>
>
> > forrtl: severe (64): input conversion error, unit -5, file Internal Formatted Read
> >
> > It works when running interactively on the same node with the same
> > command as used in the job. Setting "setenv RSHCOMMAND /usr/bin/ssh" and
> > "setenv TCGRSH /usr/bin/ssh" in .cshrc or in the job script does not make
> > a difference. This should not be the problem anyway, as I can rsh between
> > nodes.
> >
> > Best wishes,
> >
> > Tanja
> > --
> >  =================================================================
> >   Tanja van Mourik
> >   Senior Lecturer in Chemistry
> >   School of Chemistry, University of St. Andrews
> >   North Haugh, St. Andrews, Fife KY16 9ST, Scotland (UK)
> >
> >   email: tanja.vanmourik at st-andrews.ac.uk
> >   web:   http://chemistry.st-and.ac.uk/staffmember.php?id=tvm
> >
> >   The University of St Andrews is a charity registered
> >   in Scotland: No SC013532
> >   =================================================================