If you have problems during the execution of MRCC, please attach the output with an adequate description of your case as well as the following:
- the way mrcc was invoked
- the way build.mrcc was invoked
- the output of build.mrcc
- compiler version (for example: ifort -V, gfortran -v)
- BLAS/LAPACK versions
- as well as gcc and glibc versions
This information really helps us during troubleshooting.
Problems running MPI version of MRCC after MOLPRO
10 years 2 months ago #30
by valerie_vallet
I would like to run the MPI version of MRCC as a standalone program after MOLPRO.
I have compiled MRCC with
build.mrcc Intel -pMPI
I then tried two things:
1/ I copied the MPI-compiled mrcc binaries to the sequential MOLPRO bin directory. The MOLPRO execution fails in the MRCC part with the error:
Fatal error in goldstone.
? Error
? Fatal error in goldstone.
? The problem occurs in MRCC
MPI parallel version is running.
Number of CPUs: 8
2/ I put the MPI-compiled mrcc binaries into the parallel/MPI MOLPRO bin directory.
There I get the following error message:
Module MRCC not available
According to Mihaly, I should get at some point the following message:
"Now launch mrcc in parallel mode!", but I don't.
Has anyone managed to get MOLPRO and MRCC to run smoothly in parallel?
Thanks for your help.
I attach my input file.
10 years 2 months ago #31
by kallay
Replied by kallay on topic Problems running MPI version of MRCC after MOLPRO
I suggest the following quick solution.
Start molpro with the sequential mrcc binaries, that is, the ones you created with the molpro makefile. Kill the job when mrcc (not the dmrcc driver) starts running. You will then find the files required for the parallel run in the "mrcc_testdir" directory. From there, follow the instructions in Sect. 9.3 of the manual, but use the MPI-compiled mrcc binary, that is, the one you obtained with the build.mrcc script. In brief:
1) copy fort.1* and fort.5* from the mrcc_testdir directory to the compute nodes
2) execute mrcc using mpirun (where mrcc is the MPI-compiled binary).
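For illustration, a minimal sketch of these two steps as shell commands; the host name node1, the path /scratch/mrcc_run, and the process count 8 are placeholders for your cluster, and it is assumed that the MPI-compiled mrcc produced by build.mrcc is on the PATH of the compute node:
# 1) stage the scratch files produced by the killed run on the compute node
scp mrcc_testdir/fort.1* mrcc_testdir/fort.5* node1:/scratch/mrcc_run/
# 2) run the MPI-compiled mrcc binary in that directory via mpirun
ssh node1 'cd /scratch/mrcc_run && mpirun -np 8 mrcc'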
Best regards,
Mihaly Kallay
10 years 2 days ago #66
by valerie_vallet
Replied by valerie_vallet on topic Problems running MPI version of MRCC after MOLPRO
Hi Mihály,
I am getting back to you concerning MOLPRO and the MPI version of MRCC. Is there a "clean" way to ask MOLPRO to stop execution, so that one can start an MPI run of MRCC from a batch script without having to interrupt the program manually?
As an alternative, I tried to compile MOLPRO and MRCC with the OpenMP option. MRCC still runs sequentially, most likely because the following lines
nthread=omp_get_max_threads()
write (nthread_character,'(I10)') nthread
write(iopen,'(A)')'export OMP_NUM_THREADS='//
> TRIM(ADJUSTL(nthread_character))
are commented out in ./src/util/mrcc_intface.F
Is there a reason for that?
Can one make MRCC run smoothly within MOLPRO with OpenMP?
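(For reference, and only as an illustration: if restored, those lines would simply write a setting of the form
export OMP_NUM_THREADS=<n>
with <n> the value returned by omp_get_max_threads(), into the file attached to unit iopen, presumably the script that launches mrcc.)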
Thanks,
Valérie
10 years 1 day ago #69
by kallay
Replied by kallay on topic Problems running MPI version of MRCC after MOLPRO
Dear Valérie,
I don't think there is any clean way. You can try the following. Rename dmrcc, e.g., as ddmrcc. Create a script which calls ddmrcc, copies the scratch files to the compute nodes, and then executes mrcc with mpirun. If you run molpro, it will execute this script, and everything will (hopefully) go automatically.
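A minimal sketch of such a wrapper, assuming it is installed as dmrcc next to the renamed ddmrcc, that both are on the PATH, that the scratch files end up in the working directory, and that node1, /scratch/mrcc_run, and the process count 8 are placeholders for your cluster:
#!/bin/bash
# hypothetical wrapper saved as "dmrcc"; ddmrcc is the renamed original driver
ddmrcc
# stage the scratch files needed by the parallel run on the compute node
scp fort.1* fort.5* node1:/scratch/mrcc_run/
# launch the MPI-compiled mrcc binary in parallel
ssh node1 'cd /scratch/mrcc_run && mpirun -np 8 mrcc'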
Concerning OpenMP: I don't know why these lines are commented out. Please try to restore them; that will probably solve the problem.
Best regards,
Mihaly Kallay
9 years 11 months ago #75
by valerie_vallet
Replied by valerie_vallet on topic Problems running MPI version of MRCC after MOLPRO
Dear Mihály,
I have followed your instructions.
1/ I compiled MRCC with build.mrcc Intel -pMPI
2/ I renamed dmrcc as dmrcc.exe and created a script dmrcc which contains the following:
dmrcc.exe
rsync -avP mrcc*/fort.1* mrcc*/fort.5* .
mpirun -np $NCPUS dmrcc.exe
3/ There is an error when calling goldstone:
Attempting to use an MPI routine before initializing MPI
I attach the molpro input and output files for you to see what is happening.
Thanks
9 years 11 months ago #76
by valerie_vallet
Replied by valerie_vallet on topic Problems running MPI version of MRCC after MOLPRO
Here is the tar file with the input and output files.