[molpro-user] 2012.1 build with latest ifort

Andy May MayAJ1 at cardiff.ac.uk
Wed Dec 5 10:49:42 GMT 2012


Kirk,

Molpro/src/mrci> for i in *.F*; do grep "\<$i\>" ../../configure ; done

will give you a list of 'usual suspect' files which on occasion need to 
be deoptimized. But it is perfectly possible that some other file, 
perhaps outside the mrci directory, could be responsible.
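
If you only want the offending file names rather than the matching
configure lines, a small variant of the same loop (run from a source
subdirectory, e.g. src/mrci) is:

Molpro/src/mrci> for i in *.F*; do grep -q "\<$i\>" ../../configure && echo "$i"; done

Any file name that prints appears somewhere in configure, which is what
marks it as a 'usual suspect' here.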

Best wishes,

Andy

On 29/11/12 16:30, Kirk Peterson wrote:
> Andy,
>
> Actually, since yesterday I've seen another problem that I haven't tracked down yet. It happens sometimes in MRCI, where I get errors like:
>
>    ITER. STATE  ROOT     SQ.NORM     CORR.ENERGY   TOTAL ENERGY   ENERGY CHANGE       DEN1      VAR(S)    VAR(P)      TIME
>      1     1     1     1.00000000     0.00000000  -472.84329554     0.00000000    -0.46195997  0.13D-01  0.17D+00     0.12
>      1     2     2     1.00000000     0.00000000  -472.84329554     0.00000000    -0.46189591  0.13D-01  0.17D+00     0.12
>      1     3     3     1.00000000     0.00000000  -472.84329554    -0.00000000    -0.46211488  0.13D-01  0.17D+00     0.12
>    NON ZERO P-SP. GR.                     1  2.307130755951903E-005
>     19.2205640059808       -51.3471200527126      -0.374325588561569
>    NON ZERO P-SP. GR.                     2 -2.686130349260907E-007
>     5.21678081133428       -51.3471200527126      -0.101598318943532
>    NON ZERO P-SP. GR.                     4  2.089754211453965E-006
>     6.99646184593371       -51.3471200527126      -0.136258075409039
>    NON ZERO P-SP. GR.                     5 -6.320252897040746E-006
> ....
> ....
>   Internal expansion vectors linearly dependent.
>   Smallest eigenvalue of S: -0.3376972    . Vector removed
>   Internal expansion vectors linearly dependent.
>   Smallest eigenvalue of S: -0.2122623    . Vector removed
>   Internal expansion vectors linearly dependent.
>   Smallest eigenvalue of S: -0.1276288E-01. Vector removed
>   ? Error
>   ? No eigenvector overlaps with reference state 1
>
>
> The exact same input runs with no problems using a Molpro version built with ifort v13 on a Nehalem processor rather than Sandy Bridge.  Any idea which routine I should be looking at?
>
> best wishes,
>
> Kirk
>
>
> On Nov 29, 2012, at 4:27 AM, Andy May <MayAJ1 at Cardiff.ac.uk> wrote:
>
>> Some more general info:
>>
>> There is also a bug open about this problem:
>>
>> https://www.molpro.net/bugzilla/show_bug.cgi?id=3916
>>
>> which your fix of deoptimizing muwvfn.F will most likely resolve.
>>
>> We don't yet have the Intel 13 compiler, but will hopefully get it shortly. The list of compilers that Molpro is currently tested with can be found at:
>>
>> http://www.molpro.net/supported/
>>
>> Best wishes,
>>
>> Andy
>>
>> On 12/11/12 06:50, Kirk Peterson wrote:
>>> Just to complete this thread, this is now fixed.  There must have been
>>> some sort of inconsistency in my library paths or something, since at
>>> some point the MRCI problem below disappeared.  For the record, I also
>>> misspoke below: one needs to compile multi/muwvfn.F with OPT1 (and not
>>> muaug.F).
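>>>
>>> (As a rough sketch of how to apply that, assuming the build exposes an
>>> OPT1 / lower-optimization file list, e.g. in the generated CONFIG; the
>>> exact mechanism may differ between Molpro versions: add multi/muwvfn.F
>>> to that list and then force just that file to be rebuilt, e.g.
>>>
>>>    touch src/multi/muwvfn.F
>>>    make
>>> )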
>>>
>>> best,
>>>
>>> -Kirk
>>>
>>> On Nov 10, 2012, at 1:25 PM, Kirk Peterson <kipeters at wsu.edu> wrote:
>>>
>>>> As an update to this, the problem with multi was easy to fix: just
>>>> compile muaug.F with -O2.  The real problem is in the MRCI, where it
>>>> hangs for a long time and eventually I get a bunch of lines like:
>>>>
>>>> Number of p-space configurations:   1
>>>>   NON ZERO P-SP. GR.                     4  1.077040023123477E-003
>>>>    1.37714225160409       -25.2064501018170      -5.459178924531589E-002
>>>>   NON ZERO P-SP. GR.                     4  6.186392658566131E-003
>>>>   0.624789661427233       -29.6917480361564      -2.083418153809529E-002
>>>>   NON ZERO P-SP. GR.                     4  6.939620540519886E-003
>>>>    1.18397024064320       -39.0969508512373      -3.010543263543086E-002
>>>>   NON ZERO P-SP. GR.                     4  6.674144365430790E-003
>>>>    1.53773711679753       -59.8632618117633      -2.557600314607719E-002
>>>>   NON ZERO P-SP. GR.                     4  7.027066282512706E-003
>>>>    2.43885843805420       -104.128402980249      -2.335415988501181E-002
>>>>   NON ZERO P-SP. GR.                     4  3.631445207469497E-003
>>>>    5.56363239276109       -440.472955145745      -1.262279756929438E-002
>>>>   NON ZERO P-SP. GR.                     4  1.489914741162224E-003
>>>>    12.7352403036245       -2004.81756495344      -6.351575630363697E-003
>>>>   NON ZERO P-SP. GR.                     4  5.507653598897377E-004
>>>>    32.3847976001225       -13037.8013700382      -2.483873309282336E-003
>>>>   NON ZERO P-SP. GR.                     4  1.213631637853041E-004
>>>>    84.7182220590193       -90053.2547920299      -9.407555661536559E-004
>>>>
>>>> Everything else in h2o_vdz.test runs fine up until then.  The odd
>>>> thing is that this only happens in parallel (same build, just more
>>>> than 1 core requested) and only on my new Intel Sandy Bridge processors
>>>> (it works fine on slightly older Nehalem procs).  I've tried ifort
>>>> v13.0.1 and also ifort v12.1.6 with the same result.  This parallel
>>>> build uses auto-ga-mpich2.
>>>>
>>>> An older build with ifort v11.1 works fine, but unfortunately I can't
>>>> use it much since my CentOS 6.0 upgrade has broken that compiler.
>>>>
>>>> any hints or suggestions?
>>>>
>>>> -Kirk
>>>>
>>>> PS - in case it's not clear, this is a Linux build.
>>>>
>>>>
>>>>>
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> Has anyone successfully built Molpro 2012.1 with the latest ifort,
>>>>>> i.e. v13.0.1?  It seems to compile and link fine, but when running
>>>>>> h2o_vdz.test it dies in multi with a segmentation fault:
>>>>>>
>>>>>> ITER. MIC  NCI  NEG     ENERGY(VAR)     ENERGY(PROJ)   ENERGY CHANGE
>>>>>>     GRAD(0)  GRAD(ORB)   GRAD(CI)     STEP       TIME
>>>>>>
>>>>>>
>>>>>> GLOBAL ERROR fehler on processor   0
>>>>>>
>>>>>>
>>>>>> I've built an mpp version (ga-mpich2) linked to the MKL library using
>>>>>> gcc and g++, with everything else at its default settings.
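>>>>>>
>>>>>> (For anyone trying to reproduce this kind of build, a configure line
>>>>>> of roughly this shape selects ifort, mpp, and the auto-built GA over
>>>>>> mpich2; the option names here are only approximate and should be
>>>>>> checked against the 2012.1 configure help, with MKL picked up via the
>>>>>> appropriate BLAS option/path:
>>>>>>
>>>>>>    ./configure -ifort -mpp -auto-ga-mpich2 ...
>>>>>> )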
>>>>>>
>>>>>> best,
>>>>>>
>>>>>> -Kirk
>>>>>
>>>>
>>>
>>>
>>>
>>>
>


