[molpro-user] Problems with a large eom-ccsd calculation

Evgeniy Gromov Evgeniy.Gromov at pci.uni-heidelberg.de
Sat Apr 23 11:04:45 BST 2011


Dear Tatiana,

Thank you very much for your response.
Just for your information, the same job ran fine with the conventional
(not direct) EOM-CCSD procedure. But it took quite long, longer than it
would have taken in direct mode.
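
(For reference, a minimal sketch of how the two modes are typically
selected in a Molpro input; the memory value, the state list, and the use
of the gdirect card here are illustrative, not our exact input:

    memory,800,m
    gdirect           ! integral-direct mode; omit this card for the
                      ! conventional (integrals-on-disk) procedure
    hf
    {ccsd;eom,-3.1}   ! EOM-CCSD on top of the ground-state CCSD
)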

Best wishes,
Evgeniy

Tatiana Korona wrote:
> Dear Evgeniy,
> 
> EOM-CCSD needs more memory for intermediates than the ground state CCSD, 
> but it is suspicious that it needs so much in your case. I will 
> investigate this. Sorry for the delayed answer.
> 
> Best wishes,
> 
> Tatiana
> 
> On Mon, 11 Apr 2011, Evgeniy Gromov wrote:
> 
>> Dear Manhui,
>>
>> The calculation crashed again at the same spot as before, i.e.
>> directly after the CCSD step finished. This time there was
>> an error message (in the err file):
>>
>> Last System Error Message from Task 8:: Cannot allocate memory
>> application called MPI_Abort(MPI_COMM_WORLD, -1332675796) - process 9
>>
>> This time I set the memory to 1600m (the nodes have 16 GB each) and ran
>> 1 process per node. Just in case, the input and output files are attached.
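>>
>> (A rough memory budget, assuming Molpro's 8-byte words, consistent with
>> the 800 MWord = 6.4 GB figure quoted further down this thread:
>>
>>     1600m words x 8 bytes/word = 12.8 GB per process
>>     16 GB/node - 12.8 GB       = ~3.2 GB left for GA buffers and the OS
>>
>> so this setting is already close to the physical limit of the node.)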
>>
>> Best regards,
>> Evgeniy
>>
>> Manhui Wang wrote:
>>> Dear Evgeniy,
>>>
>>> Have you built a working Molpro2010.1 with GA 5-0-2 now? Does your
>>> job work with the GA-based Molpro?
>>>
>>> Best wishes,
>>> Manhui
>>>
>>> Manhui Wang wrote:
>>>> Dear Evgeniy,
>>>>
>>>> Just a reminder that when you configure GA
>>>>
>>>>     ./configure --with-tcgmsg \
>>>>       --with-mpi="-ldl -I/software/intel/mpi/4.0.0.025/intel64/include -L/software/intel/mpi/4.0.0.025/intel64/lib -lmpi -lmpigf -lmpigi -lpthread -lrt" \
>>>>       --with-openib="/usr/lib64 -libverbs -I/usr/include/infiniband" \
>>>>       F77=ifort CC=icc CXX=icpc --prefix=${GA_path}-install
>>>>
>>>> the content of --with-mpi="..." comes from the MPI link options (e.g.
>>>> mpiifort -show, mpiifort -link_info, mpif90 -show), and you may have
>>>> to replace it with your own.
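>>>>
>>>> (A quick way to obtain that link line; the exact flags depend on your
>>>> MPI installation:
>>>>
>>>>     mpiifort -show     # Intel MPI: prints the full compile/link command
>>>>     mpif90 -show       # generic MPICH-style wrappers
>>>>
>>>> and copy the -I/-L/-l options from the output into --with-mpi="...".)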
>>>>
>>>> Best wishes,
>>>> Manhui
>>>>
>>>>
>>>>
>>>> Evgeniy Gromov wrote:
>>>>> Manhui,
>>>>>
>>>>> Thanks for your help.
>>>>> OK, the Molpro configure finally passed. The problem was
>>>>> that I followed the manual, which says the ga_libs
>>>>> should be in lib/LINUX64 (the new GA puts the libs in /ga-path/lib).
>>>>> Well, I didn't recompile the whole code, only the libraries,
>>>>> and the new binary still does not work. I will try to recompile
>>>>> the code completely. Maybe that will help.
>>>>>
>>>>> Best wishes,
>>>>> Evgeniy
>>>>>
>>>>> Manhui Wang wrote:
>>>>>> Dear Evgeniy,
>>>>>>
>>>>>> Evgeniy Gromov wrote:
>>>>>>> Dear Manhui,
>>>>>>>
>>>>>>> GA runs OK (I ran test.x on 16 CPUs).
>>>>>>> To run the Molpro configure in a fresh molpro2010.1
>>>>>>> I used your example (with the corrected paths):
>>>>>>>
>>>>>>> ./configure -batch -icc -ifort -mpp -mppbase /bwfs/scratch/temphomes/hd/f81/prog/ga-5-0-2 -var LIBS="-libverbs" -blaspath /opt/bwgrid/compiler/intel/ct_4.0/mkl/10.2.5.035/lib/em64t
>>>>>>>
>>>>>>> The Molpro configure then crashes after printing the following:
>>>>>>>
>>>>>>> ga_GA_MP_LIBS     = -lmpi -lmpigf -lmpigi -lpthread
>>>>>>> ga_GA_MP_LDFLAGS  = -L/opt/bwgrid/compiler/intel/ct_4.0/impi/4.0.0.028/intel64/lib
>>>>>>> ga_GA_MP_CPPFLAGS = -I/opt/bwgrid/compiler/intel/ct_4.0/impi/4.0.0.028/intel64/include
>>>>>>> ga_TARGET         = LINUX64
>>>>>>> ga_MSG_COMMS      = TCGMSGMPI
>>>>>>> ga_prefix         = /bwfs/scratch/temphomes/hd/f81/ga-5-0-2-install
>>>>>>>
>>>>>>> MPPLIB=
>>>>>>> parallel build specified but unable to find Global Arrays
>>>>>> It is trying to find the generated GA lib files in directory
>>>>>> /path/ga-5-0-2-install/lib/
>>>>>>
>>>>>> A typical example:
>>>>>> ls /mypath/ga-5-0-2-install/lib/
>>>>>> libarmci.a  libarmci.la  libga.a  libga.la
>>>>>>
>>>>>> Best wishes,
>>>>>> Manhui
>>>>>>
>>>>>>> Result so far of configure written to CONFIG.errout
>>>>>>>
>>>>>>> Apparently the Molpro configure finds the config.log in the ga-5-0-2
>>>>>>> directory, but I don't understand what it needs beyond that. What is
>>>>>>> this MPPLIB?
>>>>>>>
>>>>>>> Best,
>>>>>>> Evgeniy
>>>>>>>
>>>>>>> Manhui Wang wrote:
>>>>>>>> Dear Evgeniy,
>>>>>>>>
>>>>>>>> Could you please check that your GA is working properly?
>>>>>>>> Could you please provide more details about how you run configure
>>>>>>>> in a fresh Molpro2010.1 PL20?
>>>>>>>>
>>>>>>>> Best wishes,
>>>>>>>> Manhui
>>>>>>>>
>>>>>>>> Evgeniy Gromov wrote:
>>>>>>>>> Dear Manhui,
>>>>>>>>>
>>>>>>>>> If I specify the original GA directory, configure cannot find
>>>>>>>>> some libraries and crashes:
>>>>>>>>>
>>>>>>>>> ga_GA_MP_LIBS     = -lmpi -lmpigf -lmpigi -lpthread
>>>>>>>>> ga_GA_MP_LDFLAGS  = -L/opt/bwgrid/compiler/intel/ct_4.0/impi/4.0.0.028/intel64/lib
>>>>>>>>> ga_GA_MP_CPPFLAGS = -I/opt/bwgrid/compiler/intel/ct_4.0/impi/4.0.0.028/intel64/include
>>>>>>>>> ga_TARGET         = LINUX64
>>>>>>>>> ga_MSG_COMMS      = TCGMSGMPI
>>>>>>>>> ga_prefix         = /bwfs/scratch/temphomes/hd/f81/ga-5-0-2-install
>>>>>>>>>
>>>>>>>>> MPPLIB=
>>>>>>>>> parallel build specified but unable to find Global Arrays
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> It looks like the structure of the GA directory has changed in
>>>>>>>>> the latest release; at least there are neither include nor lib
>>>>>>>>> directories in the GA directory. Should I take some older GA
>>>>>>>>> release then? Which one?
>>>>>>>>>
>>>>>>>>> Best,
>>>>>>>>> Evgeniy
>>>>>>>>>
>>>>>>>>> Manhui Wang wrote:
>>>>>>>>>> Dear Evgeniy,
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> It appears your CONFIG is not right. As I mentioned in a
>>>>>>>>>> previous email:
>>>>>>>>>>
>>>>>>>>>> ./configure -batch -icc -ifort -blaspath /software/intel/mkl/10.2.5.035/lib/em64t -mpp -mppbase ${GA_path} -var LIBS="-libverbs"
>>>>>>>>>>
>>>>>>>>>> Please note that ${GA_path} is not the GA installation directory
>>>>>>>>>> but the original GA directory in which you typed
>>>>>>>>>> "configure ... make". The Molpro configure takes some information
>>>>>>>>>> from ${GA_path} to determine the GA mode, the underlying MPI
>>>>>>>>>> library, etc.
>>>>>>>>>> -var LIBS="-libverbs" will make Intel MPI link against the
>>>>>>>>>> InfiniBand library if the Intel MPI on your machine cannot do
>>>>>>>>>> this automatically.
>>>>>>>>>> Once the CONFIG is generated, we do not recommend modifying it
>>>>>>>>>> manually, since that may introduce unexpected problems.
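>>>>>>>>>>
>>>>>>>>>> (A sketch of the distinction, with hypothetical paths:
>>>>>>>>>>
>>>>>>>>>>     GA_path=$HOME/src/ga-5-0-2          # build dir: where ./configure and make were run
>>>>>>>>>>     GA_install=$HOME/ga-5-0-2-install   # install dir: the --prefix target
>>>>>>>>>>     ./configure -batch -icc -ifort -mpp -mppbase ${GA_path} ...
>>>>>>>>>>
>>>>>>>>>> i.e. -mppbase should point at the first, not the second.)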
>>>>>>>>>>
>>>>>>>>>> Best wishes,
>>>>>>>>>> Manhui
>>>>>>>>>>
>>>>>>>>>> Evgeniy Gromov wrote:
>>>>>>>>>>> Dear Manhui,
>>>>>>>>>>>
>>>>>>>>>>> Thanks for your response.
>>>>>>>>>>> There is unfortunately no progress with GA. I compiled Molpro,
>>>>>>>>>>> but the binary doesn't work at all, although the compilation
>>>>>>>>>>> went OK. There is again somehow no ABDATA in lib, but it doesn't
>>>>>>>>>>> complain about that. What it does complain about looks strange:
>>>>>>>>>>>
>>>>>>>>>>> Attempting to use an MPI routine before initializing MPICH
>>>>>>>>>>>
>>>>>>>>>>> Well, I used Intel MPI and I don't understand why it refers to
>>>>>>>>>>> MPICH.
>>>>>>>>>>>
>>>>>>>>>>> Just in case, my CONFIG looks like the following:
>>>>>>>>>>>
>>>>>>>>>>> # MOLPRO CONFIG generated at Di 5. Apr 10:21:18 CEST 2011, for host frbw3
>>>>>>>>>>>
>>>>>>>>>>> CONFIGURE_OPTIONS="-i8" "-openmp" "-icc" "-ifort" "-nohdf5" "-nocuda" "-mpp" "-mppbase" "/bwfs/scratch/temphomes/hd/f81/ga-5-0-2-install" "-blaspath" "/opt/bwgrid/compiler/intel/ct_4.0/mkl/10.2.5.035/lib/em64t"
>>>>>>>>>>>
>>>>>>>>>>> AR=ar
>>>>>>>>>>> ARCHNAME=Linux/x86_64
>>>>>>>>>>> ARFLAGS=-rS
>>>>>>>>>>> AWK=awk
>>>>>>>>>>> BIBTEX=/usr/bin/bibtex
>>>>>>>>>>> BLASLIB=-L/opt/bwgrid/compiler/intel/ct_4.0/mkl/10.2.5.035/lib/em64t -lmkl_intel_ilp64 -lmkl_sequential -lmkl_core
>>>>>>>>>>> BUILD=p
>>>>>>>>>>> CAT=cat
>>>>>>>>>>> CC=/opt/bwgrid/compiler/intel/ct_4.0/Compiler/11.1/072/bin/intel64/icc
>>>>>>>>>>> CCVERSION=11.1
>>>>>>>>>>> CC_FRONT=
>>>>>>>>>>> CDEBUG=-g $(addprefix $(CDEFINE),_DEBUG)
>>>>>>>>>>> CDEFINE=-D
>>>>>>>>>>> CFLAGS=-ftz -vec-report0 -DINT64 -DZLIB -openmp
>>>>>>>>>>> CLEARSPEEDLIB=
>>>>>>>>>>> CMPPINCLUDE=/bwfs/scratch/temphomes/hd/f81/ga-5-0-2-install/include
>>>>>>>>>>> COPT=-O2
>>>>>>>>>>> COPT0=-O0
>>>>>>>>>>> COPT1=-O1
>>>>>>>>>>> COPT2=-O2
>>>>>>>>>>> COPT3=-O3
>>>>>>>>>>> CP=cp -p
>>>>>>>>>>> CPROFILE=-p
>>>>>>>>>>> CSCN=cscn
>>>>>>>>>>> CSFLAGS=-O3 -I. --dynamic
>>>>>>>>>>> CUDACC=
>>>>>>>>>>> CUDACCVERSION=
>>>>>>>>>>> CUDACDEBUG=-g $(addprefix $(CUDACDEFINE),_DEBUG)
>>>>>>>>>>> CUDACDEFINE=-D
>>>>>>>>>>> CUDACFLAGS=
>>>>>>>>>>> CUDACOPT=
>>>>>>>>>>> CUDACOPT0=-O0
>>>>>>>>>>> CUDACOPT1=-O1
>>>>>>>>>>> CUDACOPT2=-O2
>>>>>>>>>>> CUDACOPT3=-O3
>>>>>>>>>>> CUDACPROFILE=-p
>>>>>>>>>>> CXX=/opt/bwgrid/compiler/intel/ct_4.0/Compiler/11.1/072/bin/intel64/icpc
>>>>>>>>>>> CXXFLAGS=$(CFLAGS)
>>>>>>>>>>> DOXYGEN=/usr/bin/doxygen
>>>>>>>>>>> ECHO=echo
>>>>>>>>>>> EXPORT=export
>>>>>>>>>>> F90FLAGS=
>>>>>>>>>>> FC=/opt/bwgrid/compiler/intel/ct_4.0/Compiler/11.1/072/bin/intel64/ifort
>>>>>>>>>>> FCVERSION=11.1
>>>>>>>>>>> FDEBUG=-g $(addprefix $(FDEFINE),_DEBUG)
>>>>>>>>>>> FDEFINE=-D
>>>>>>>>>>> FFLAGS=-i8 -r8 -pc64 -auto -Vaxlib -vec-report0 -cxxlib -openmp
>>>>>>>>>>> FOPT=-O3
>>>>>>>>>>> FOPT0=-O0
>>>>>>>>>>> FOPT1=-O1
>>>>>>>>>>> FOPT2=-O2
>>>>>>>>>>> FOPT3=-O3
>>>>>>>>>>> FPP=-fpp
>>>>>>>>>>> FPROFILE=-p
>>>>>>>>>>> FSTATIC=
>>>>>>>>>>> FTCFLAGS=molpro unix unix-i8 Linux lapack mpp blas
>>>>>>>>>>> HDF5INCLUDE=
>>>>>>>>>>> HDF5LIB=
>>>>>>>>>>> HOSTFILE_FORMAT=
>>>>>>>>>>> INSTBIN=/bwfs/scratch/temphomes/hd/f81/molpro2010.1-install/bin
>>>>>>>>>>> INSTHTML=/bwfs/scratch/temphomes/hd/f81/molpro2010.1-install/html
>>>>>>>>>>> INSTLIB=/bwfs/scratch/temphomes/hd/f81/molpro2010.1-install/lib/molprop_2010_1_Linux_x86_64_i8
>>>>>>>>>>> INST_PL=0
>>>>>>>>>>> INTEGER=8
>>>>>>>>>>> LAPACKLIB=
>>>>>>>>>>> LATEX2HTML=
>>>>>>>>>>> LAUNCHER=/opt/bwgrid/mpi/mpiexec/0.84/bin/mpiexec -n %n %x
>>>>>>>>>>> LD_ENV=/opt/bwgrid/compiler/intel/ct_4.0/Compiler/11.1/072/lib/intel64:/opt/bwgrid/compiler/intel/ct_4.0/mkl/10.2.5.035/lib/em64t
>>>>>>>>>>> LD_ENVNAME=LD_LIBRARY_PATH
>>>>>>>>>>> LIBRARY_SUFFIX=a
>>>>>>>>>>> LIBS=-lz
>>>>>>>>>>> LIBS_FRONT=
>>>>>>>>>>> LINKOPT=
>>>>>>>>>>> LINKOPT_FRONT=
>>>>>>>>>>> LN=ln -s
>>>>>>>>>>> MACROS=MOLPRO MOLPRO_ifort MOLPRO_f2003 GA_TOOLS GA_ GA_VERSION_GE_5 _I8_ BLAS_INT=8 LAPACK_INT=8 MOLPRO_FORCE_VECTOR MOLPRO_NEXTSCALAR MOLPRO_NO_RECURRENCE MOLPRO_NOVECTOR MOLPRO_SHORTLOOP _MOLCAS_MPP_
>>>>>>>>>>> MAKEDEPEND_OPTIONS=
>>>>>>>>>>> MAKEINDEX=/usr/bin/makeindex
>>>>>>>>>>> MAPLE=
>>>>>>>>>>> MKDIR=mkdir -p
>>>>>>>>>>> MODULE_FLAG=-I
>>>>>>>>>>> MODULE_SUFFIX=mod
>>>>>>>>>>> MPILIB=-L/opt/bwgrid/compiler/intel/ct_4.0/impi/4.0.0.028/intel64/lib -lmpi -lmpigf -lmpigi -lpthread -L/usr/lib64 -libverbs -libcommon -libumad -lpthread
>>>>>>>>>>> MPPLIB=-L/bwfs/scratch/temphomes/hd/f81/ga-5-0-2-install/lib/LINUX64 -lga -larmci
>>>>>>>>>>> OBJECT_SUFFIX=o
>>>>>>>>>>> OPT0=copyc6.f
>>>>>>>>>>> OPT1=nevpt2_optrpc.f explicit_util.f artwo2.f drv2el_l3ext_lmp2g.f drv2el_l3ext_cen.f rmp2_f12_drv2.f90 ri_lmp2g.f df_llmp2.f readdump.f
>>>>>>>>>>> OPT2=integrals.f90 RL4gen1.f basis_integral_shells.f
>>>>>>>>>>> OPT3=
>>>>>>>>>>> PAPER=a4paper
>>>>>>>>>>> PARSE=parse-Linux-x86_64-i8.o
>>>>>>>>>>> PDFLATEX=/usr/bin/pdflatex -halt-on-error
>>>>>>>>>>> PNAME=molprop_2010_1_Linux_x86_64_i8
>>>>>>>>>>> PTSIZE=11
>>>>>>>>>>> RANLIB=ranlib
>>>>>>>>>>> RM=rm -rf
>>>>>>>>>>> SHELL=/bin/sh
>>>>>>>>>>> STRIP=strip
>>>>>>>>>>> SUFFIXES=f F f90 F90 c cpp
>>>>>>>>>>> TAR=tar -cf
>>>>>>>>>>> UNTAR=tar -xf
>>>>>>>>>>> VERBOSE=@
>>>>>>>>>>> VERSION=2010.1
>>>>>>>>>>> XSD=/usr/bin/xmllint --noout --schema
>>>>>>>>>>> XSLT=/usr/bin/xsltproc
>>>>>>>>>>> YACC=bison -b y
>>>>>>>>>>>
>>>>>>>>>>> .SUFFIXES:
>>>>>>>>>>> MAKEFLAGS+=-r
>>>>>>>>>>> ifneq ($(LD_ENVNAME),)
>>>>>>>>>>> $(LD_ENVNAME):=$(LD_ENV):$($(LD_ENVNAME))
>>>>>>>>>>> endif
>>>>>>>>>>>
>>>>>>>>>>> Best regards,
>>>>>>>>>>> Evgeniy
>>>>>>>>>>>
>>>>>>>>>>> Manhui Wang wrote:
>>>>>>>>>>>> Dear Evgeniy,
>>>>>>>>>>>>
>>>>>>>>>>>> I had not checked your output carefully; it shows 800 MWord
>>>>>>>>>>>> (6.4 GB per process):
>>>>>>>>>>>>
>>>>>>>>>>>> Variable memory set to  800000000 words,  buffer space   230000 words
>>>>>>>>>>>>
>>>>>>>>>>>> I ran the job with 32 processes on 4 nodes, and it might crash
>>>>>>>>>>>> eventually due to the lack of memory you mentioned.
>>>>>>>>>>>>
>>>>>>>>>>>> Could you please update me on your progress with GA? On my
>>>>>>>>>>>> side, I will look into the helper-related part of my code.
>>>>>>>>>>>> There might be a bug that is exposed when a very large global
>>>>>>>>>>>> data structure is used.
>>>>>>>>>>>>
>>>>>>>>>>>> Best wishes,
>>>>>>>>>>>> Manhui
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Evgeniy Gromov wrote:
>>>>>>>>>>>>> Dear Manhui,
>>>>>>>>>>>>>
>>>>>>>>>>>>> Many thanks for your help with the compilation problems and
>>>>>>>>>>>>> with the problem in the large EOM-CCSD calculation.
>>>>>>>>>>>>> Sorry, I forgot to tell you about the memory. I usually specify
>>>>>>>>>>>>> the memory on the command line using the -m option. For that
>>>>>>>>>>>>> large EOM-CCSD job I used 800m (-m800m) because 200m was not
>>>>>>>>>>>>> enough for the CCSD step. As I wrote, the job passed the CCSD
>>>>>>>>>>>>> and crashed at the beginning of the EOM-CCSD. Actually, are you
>>>>>>>>>>>>> running it on 1 CPU? I haven't tried running it on 1 CPU, as it
>>>>>>>>>>>>> would likely take ages because of the many MOs. Well, I am
>>>>>>>>>>>>> going to rerun it using a new binary compiled with GA.
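>>>>>>>>>>>>>
>>>>>>>>>>>>> (For concreteness, the usual invocation pattern; the process
>>>>>>>>>>>>> count and file name are placeholders:
>>>>>>>>>>>>>
>>>>>>>>>>>>>     molpro -n 16 -m 800m job.inp   # 16 processes, 800 MWord per process
>>>>>>>>>>>>> )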
>>>>>>>>>>>>>
>>>>>>>>>>>>> Best wishes,
>>>>>>>>>>>>> Evgeniy
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> Manhui Wang wrote:
>>>>>>>>>>>>>> Dear Evgeniy,
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I can't see the error you mentioned with MPI-2 Molpro2010.1
>>>>>>>>>>>>>> PL20 so far.
>>>>>>>>>>>>>> I tried your input without any modification first, and it
>>>>>>>>>>>>>> failed immediately because you had not set the memory in the
>>>>>>>>>>>>>> input (the default is only 8 MWord). I have added
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> memory,200,m
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> to the input, and it appears to work fine so far (still
>>>>>>>>>>>>>> running). Could you please try the new input on your side?
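>>>>>>>>>>>>>>
>>>>>>>>>>>>>> (A minimal sketch of the placement, with geometry and basis as
>>>>>>>>>>>>>> placeholders:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>     ***,large EOM-CCSD job
>>>>>>>>>>>>>>     memory,200,m      ! memory card before the first calculation
>>>>>>>>>>>>>>     geometry={...}
>>>>>>>>>>>>>>     basis=avdz
>>>>>>>>>>>>>>     hf
>>>>>>>>>>>>>>     {ccsd;eom,-3.1}   ! ground-state CCSD, then EOM-CCSD
>>>>>>>>>>>>>> )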
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Best wishes,
>>>>>>>>>>>>>> Manhui
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Evgeniy Gromov wrote:
>>>>>>>>>>>>>>> Manhui,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I attached the wrong files in the previous email. Here are
>>>>>>>>>>>>>>> the correct input and output files. In addition, I removed
>>>>>>>>>>>>>>> all the MO vectors from the output to reduce its size.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Best regards,
>>>>>>>>>>>>>>> Evgeniy
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Manhui Wang wrote:
>>>>>>>>>>>>>>>> Dear Evgeniy,
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Could you please provide the inputs/outputs? I will first
>>>>>>>>>>>>>>>> see whether I can reproduce the problem.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>>>>> Manhui
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Evgeniy Gromov wrote:
>>>>>>>>>>>>>>>>> Dear Molpro Community,
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I faced problems when running a large EOM-CCSD calculation
>>>>>>>>>>>>>>>>> (466 MOs, no symmetry) with molpro2010.1 in parallel
>>>>>>>>>>>>>>>>> (MPI-2). The calculation crashed after the CCSD (ground
>>>>>>>>>>>>>>>>> state) step had finished. No error message was present in
>>>>>>>>>>>>>>>>> the output. In the err file there is the following
>>>>>>>>>>>>>>>>> diagnostic (though unclear to me):
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> ERROR in twosided_helpga_locate_server: over range! ilo=1,ihigh=148114575
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Any idea what the problem could be?
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Well, it seems to be related to the helper process. I
>>>>>>>>>>>>>>>>> suspect that with GA it would work, although there might
>>>>>>>>>>>>>>>>> be some other problems. Just in case: the same run with
>>>>>>>>>>>>>>>>> 296 MOs (another basis set) ran fine.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Also, I would like to thank Manhui for the response to my
>>>>>>>>>>>>>>>>> previous
>>>>>>>>>>>>>>>>> post:
>>>>>>>>>>>>>>>>> "Is there always a helper process in parallel calculations
>>>>>>>>>>>>>>>>> with
>>>>>>>>>>>>>>>>> molpro2010.1 ?"
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Best regards,
>>>>>>>>>>>>>>>>> Evgeniy
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
> 
> Dr. Tatiana Korona http://tiger.chem.uw.edu.pl/staff/tania/index.html
> Quantum Chemistry Laboratory
> University of Warsaw
> Pasteura 1, PL-02-093 Warsaw, POLAND
> 
> 
> `The man who makes no mistakes does not usually make anything.'
>                                        Edward John Phelps (1822-1900)
> 


-- 
_______________________________________
Dr. Evgeniy Gromov
Theoretische Chemie
Physikalisch-Chemisches Institut
Im Neuenheimer Feld 229
D-69120 Heidelberg
Germany

Telefon: +49/(0)6221/545263
Fax: +49/(0)6221/545221
E-mail: evgeniy at pci.uni-heidelberg.de
_______________________________________