[molpro-user] parallel casscf jobs?

Alan Chien alandc at umich.edu
Tue May 13 13:29:57 BST 2014


Hey all,

I'm trying to benchmark parallel CASSCF jobs for a system to see how much
speed-up we can get.
However, at higher core counts the jobs seem to slow down massively in
the mcscf program. And by massive I mean my 10-core job finished in 2 hours,
while my 20- and 30-core jobs are still running after 15+ hours.
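
(For reference, here's the rough scaling arithmetic I'm going by, as a
plain-Python sketch; the hours are the approximate wall times above, and
the 20- and 30-core entries are only lower bounds since those jobs haven't
finished.)

# Speed-up and parallel efficiency relative to the 10-core job.
# Wall times are rough; 20- and 30-core values are lower bounds (still running).
wall_hours = {10: 2.0, 20: 15.0, 30: 15.0}

base_cores = 10
base_time = wall_hours[base_cores]

for cores in sorted(wall_hours):
    speedup = base_time / wall_hours[cores]   # measured speed-up vs. 10 cores
    ideal = cores / base_cores                # perfect scaling vs. 10 cores
    print(f"{cores:3d} cores: {speedup:4.2f}x measured, "
          f"{ideal:.1f}x ideal, efficiency {speedup / ideal:6.1%}")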

I checked that the higher-core jobs are actually running in parallel by
logging into the node and using the "top" command, which shows the expected
number of processes. I also see speed-ups for the HF portion of the jobs,
which is another good sign. Thus parallelism seems to be working, and I
imagine the problem lies with the mcscf program?
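
(To see where the time actually goes, I've been pulling the timing lines out
of the output files with a small filter like the sketch below. It just prints
any line mentioning "TIME"; the exact wording of Molpro's timing output is an
assumption here, and the file names are made up, so adjust the pattern to
taste.)

import sys

# Print timing-related lines from Molpro output files so the HF and MCSCF
# steps can be compared across core counts. Matching on "TIME" is a guess
# about how the timing lines are worded -- refine the filter as needed.
def show_timings(path):
    with open(path) as f:
        for line in f:
            if "TIME" in line.upper():
                print(line.rstrip())

if __name__ == "__main__":
    for outfile in sys.argv[1:]:        # e.g. bench_10core.out bench_20core.out
        print(f"=== {outfile} ===")
        show_timings(outfile)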

So I'm wondering if anyone knows something about the parallelism in the
mcscf program that I don't?

Thanks,
Alan Chien

PS - my input is below.

***,benching
memory,400,m
file,2,molpro_test
symmetry,nosym
orient,noorient
geometry={
  S -0.162524 0.053295 0.495328
  C 1.156761 -0.091953 1.664961
  C 0.618562 -0.184863 3.010711
  C -0.723646 -0.138987 3.071693
  C -1.358298 -0.002707 1.770142
  C 2.458808 -0.108774 1.302794
  C 2.997006 -0.015865 -0.042956
  C 4.339216 -0.061679 -0.103934
  C 4.973871 -0.197927 1.197620
  S 3.778092 -0.254037 2.472425
  C 6.314742 -0.273041 1.427851
  C 7.238480 -0.216910 0.334867
  N 7.954320 -0.165830 -0.574719
  C -2.699166 0.072485 1.539915
  C -3.622902 0.016436 2.632906
  N -4.338739 -0.034586 3.542497
  C -3.210214 0.203397 0.209889
  N -3.572921 0.311482 -0.885345
  C 6.825788 -0.403943 2.757879
  N 7.188473 -0.512016 3.853122
  H 4.930606 -0.003593 -1.011489
  H 2.349214 0.087358 -0.908262
  H 1.266353 -0.288099 3.876016
  H -1.315035 -0.197030 3.979251
}

basis=6-31G

start,2100.2 !read orbitals
! direct
hf
{casscf
closed,72 !doubly occupied (inactive) orbs
occ,76    !total occupied orbs (closed+active)
wf,148,1,0 !nelectrons,symm,singlet
state,3   !nstates
CPMCSCF,GRAD,3.1
ciguess,2501.2
save,ci=2501.2
orbital,2100.2 !write orbitals
pspace,100.0
}

{forces; varsav}
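
PPS - for what it's worth, the active space implied by the occ/closed/wf
cards above works out to a state-averaged CAS(4,4) over 3 states; here's the
quick arithmetic (plain Python, nothing Molpro-specific):

# Active-space size implied by the input: OCC counts all occupied orbitals
# (closed + active), CLOSED the doubly occupied ones, and WF gives the
# total electron count, so:
occ, closed, nelec, nstates = 76, 72, 148, 3

active_orbs = occ - closed          # 76 - 72 = 4 active orbitals
active_elec = nelec - 2 * closed    # 148 - 144 = 4 active electrons
print(f"CAS({active_elec},{active_orbs}) averaged over {nstates} states")
# -> CAS(4,4) averaged over 3 states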