[molpro-user] MRCI on Xeon E5-2650 (MPI parallel)

Grant Hill Grant.Hill at glasgow.ac.uk
Fri Feb 15 12:20:32 GMT 2013


As a quick follow-up:

I've compiled with -noblas -nolapack and it seems to have fixed the problem, which suggests the issue lies with MKL. If anyone has any ideas on how to massage MKL into behaving itself, I'd be willing to try them.
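
In case anyone wants to try the same thing, the change is only at the configure step. As a rough sketch (the bracketed part stands in for whatever compiler/MPI options you normally pass; only -noblas -nolapack is the relevant change), followed by a rebuild and the parallel quicktest from my original message:

  # fall back to Molpro's internal BLAS/LAPACK instead of linking MKL
  ./configure -noblas -nolapack [your usual compiler/MPI options]
  make
  # re-run the two-process quicktest that showed the hang
  make MOLPRO_OPTIONS=-n2 quicktest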


Grant



On 15 Feb 2013, at 10:36, Grant Hill <grant.hill at glasgow.ac.uk> wrote:

> Dear All,
> 
> I'm attempting to build Molpro 2012.1 (both the release version and several different nightlies) on a Xeon E5-2650-based machine, using ifort 13.0.1 and gcc 4.4.6. The O/S is Scientific Linux 6.3 and a CONFIG is attached. Everything works fine in serial, but when I attempt an MPI-parallel MRCI calculation I get output such as:
> 
> ----------%<----------------------------------------------
> 
> Molecular orbitals read from record     2141.2  Type=MCSCF/NATURAL (state 1.1)
> 
> Coulomb and exchange operators available. No transformation done.
> 
> Number of p-space configurations:   1
>  NON ZERO P-SP. GR.                     1  5.411188528370303E-006
>  -23.4872221976665       -23.8397063438111       0.985214636041539     
>  NON ZERO P-SP. GR.                     1  3.054201909316223E-002
>  -3.02340512645286       -6297.79613380019       4.849231509980970E-004
>  NON ZERO P-SP. GR.                     1  1.219425666945995E-002
>  -19.8292244309273       -281893.555225716       7.038620897774520E-005
>  NON ZERO P-SP. GR.                     1  3.859985552395528E-003
>  -45.9002451299124       -1527517.61471912       3.005144076450190E-005
> 
> ---------->%----------------------------------------------
> 
> And not a lot happens after that (a few similar lines are printed, then the job seems to hang).
> 
> This can easily be seen with something such as `make MOLPRO_OPTIONS=-n2 quicktest`, as the first test job displays this behaviour. I've tried mpich3, mpich2 and openmpi via the autobuild option (along with GA). Has anyone encountered this before, or does anyone have suggestions of what to try? I'm tearing my hair out at this point.
> 
> Grant
> 
> 
> P.S. I found this old message about a buggy MKL in the archive, but I don't know how to translate the workaround to the new build system: http://www.molpro.net/pipermail/molpro-user/2006-April/001764.html
> 
> <CONFIG>
> 



