- Posts: 10
- Thank you received: 0
If you run into trouble, it is always a good habit to report the following information:
- the way build.mrcc was invoked
- the output of build.mrcc
- the compiler version (for example: ifort -V, gfortran -v)
- the BLAS/LAPACK versions
- the gcc and glibc versions
as well as the values of the relevant environment variables, such as OMP_NUM_THREADS.
This information helps us a lot when figuring out what is going on with your compilation.
MRCC with GNU + OpenMPI + PCM + XC
- phillipseeber
- Topic Author
- Offline
- New Member
4 years 4 months ago - 4 years 4 months ago #925
by phillipseeber
MRCC with GNU + OpenMPI + PCM + XC was created by phillipseeber
Hi,
I am trying to build the current 2020 MRCC release from source with the GNU compilers, OpenMPI support, and the LibXC and PCMSolver interfaces, and I am facing difficulties on several points.
I am able to build a simple version without MPI, PCM, and XC after some patches, with GCC 9.2.0 and MKL 2020.1.217 on NixOS 20.03, using the following changes to build.mrcc and build.mrcc.config:
- in build.mrcc.config, adding
Code:
"-std=legacy"
- setting
Code:
fortran="gfortran"
ccompiler="gcc"
- setting
Code:
linpacklib="$mkl/lib/libmkl_gf_lp64.so $mkl/lib/libmkl_gnu_thread.so $mkl/lib/libmkl_core.so -lgomp -lpthread -lm -ldl"
Invoked as
Code:
./build.mrcc GNU -i64 -pOMP
this builds successfully.
Adding OpenMPI 4.0.4 (and setting
Code:
fortranmpi="mpifort -std=legacy"
) introduces my first errors when building with
Code:
./build.mrcc GNU -i64 -pOMP -pMPI=OpenMPI
Code:
scf.f:758:72:
758 | call build_dens(c, p, sqrsize, sqroffset)
|                                            1
Error: Type mismatch in argument 'sqrsize' at (1); passed INTEGER(4) to INTEGER(8)
scf.f:768:58:
(and many more integer size mismatches follow). Switching to -i32 does not solve this problem.
Trying to add libxc support for libxc 4.3.3, which was built with
Code:
cmake -Bbuild \
-DBUILD_SHARED_LIBS=ON \
-DCMAKE_BUILD_TYPE=Release \
-DBUILD_FPIC=ON \
-DENABLE_GENERIC=OFF \
-DENABLE_XHOST=OFF \
-DENABLE_FORTRAN=ON \
-DENABLE_FORTRAN03=ON \
-DCMAKE_INSTALL_PREFIX=$out
and calling the build script as
Code:
export LIBS_LIBXC=${libxc} # pointing to $out of the previous step
./build.mrcc GNU -i64 -pOMP -llibxc
gives linking errors:
Code:
Linking scf with libraries: /nix/store/gdx3cx5610mqq0zkyrxqq9k8hl83w8j5-mkl-2020.1.217/lib/libmkl_gf_lp64.so /nix/store/gdx3cx5610mqq0zkyrxqq9k8hl83w8j5-mkl-2020.1.217/lib/libmkl_gnu_thread.so /nix/store/gdx3cx5610mqq0zkyrxqq9k8hl83w8j5-mkl-2020.1.217/lib/libmkl_core.so -lgomp -lpthread -lm -ldl -L/nix/store/yb8xn6i2jck27rihwd7kk15rqpw0rv1v-libxc-4.3.3/lib -lxcf03 -lxc -lm
/nix/store/hrkc2sf2883l16d5yq3zg0y339kfw4xv-binutils-2.31.1/bin/ld: /nix/store/yb8xn6i2jck27rihwd7kk15rqpw0rv1v-libxc-4.3.3/lib/libxcf03.so: undefined reference to `xc_func_reference_get_bibtex'
/nix/store/hrkc2sf2883l16d5yq3zg0y339kfw4xv-binutils-2.31.1/bin/ld: /nix/store/yb8xn6i2jck27rihwd7kk15rqpw0rv1v-libxc-4.3.3/lib/libxcf03.so: undefined reference to `xc_func_reference_get_doi'
/nix/store/hrkc2sf2883l16d5yq3zg0y339kfw4xv-binutils-2.31.1/bin/ld: /nix/store/yb8xn6i2jck27rihwd7kk15rqpw0rv1v-libxc-4.3.3/lib/libxcf03.so: undefined reference to `xc_func_reference_get_ref'
collect2: error: ld returned 1 exit status
but nm on libxc shows that those functions are definitely there.
Trying to enable PCMSolver 1.2.3 support, which was built as
Code:
cmake -Bbuild -DCMAKE_INSTALL_PREFIX=$out -DENABLE_OPENMP=ON
and calling the build script as
Code:
export LIBS_PCM=${pcmsolver} # pointing to $out of the previous step
./build.mrcc GNU -i64 -pOMP -lpcm
again gives errors:
Code:
gfortran: error: pcmsolver.f: No such file or directory
gfortran: warning: '-x f95-cpp-input' after last input file has no effect
gfortran: fatal error: no input files
compilation terminated.
So I face problems on many ends and I am not sure what to change. Do I need specific build flags or versions of those libraries? I would appreciate any help.
Best wishes
Phillip
Last edit: 4 years 4 months ago by phillipseeber. Reason: Something went wrong with formatting and half of the post vanished
- nagypeter
- Offline
- Premium Member
- MRCC developer
4 years 4 months ago #934
by nagypeter
Replied by nagypeter on topic MRCC with GNU + OpenMPI + PCM + XC
Dear Phillip,
thanks for trying the new release and sorry for the troubles.
Couple of thoughts:
1) In our (not very up-to-date) experience with GNU, the performance of the resulting code is significantly behind Intel's. You will probably get better performance from the distributed binary (which has MPI, XC, and PCM support compiled with Intel) than from a compilation with GNU.
2) If for some reason you want to compile from source, I cannot recommend highly enough downloading the Intel Fortran compiler. You probably qualify for one of the free licence categories, e.g. student. Then, please follow closely Sects. 7-9 of the manual, e.g. recompile OpenMPI, XC & PCM with the same compiler using the recommended options. This should work.
3) We have never tried to combine a GNU compilation with the other libraries; this will probably not work and would need significant effort. It is not recommended by the manual either. The upside seems quite small (you can explain if I am wrong), which is why I gave options 1 and 2 first.
If you want to stick with GNU we cannot offer too much experience and help with that. Somewhere to start:
a) You can extend the GNU section of the build.mrcc.config file as:
Code:
preprocopt="-x f95-cpp-input -ffree-line-length-none"
compileropts="$compileropts -std=legacy"
compileroptsmpi="$compileropts"
linpacklib="-llapack -lblas"
It is also essential to compile OpenMPI with 8-byte integers, apply the patches (see note 8 on p. 29 of the manual), and follow all of the manual's OpenMPI instructions. It is probably simpler to use Intel MPI...
This might fix the compilation errors with MPI, but we have no idea whether the code would run well. That would require extensive testing, which has not been done.
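As a rough illustration only, the 8-byte-integer OpenMPI build could look like the sketch below. The -fdefault-integer-8 flag and the configure variables are my assumptions for a gfortran toolchain; the manual's OpenMPI instructions and patches remain authoritative.

```shell
# Sketch, not a verified recipe: configure OpenMPI so that default Fortran
# integers are 8 bytes, matching an -i64 build of mrcc. Paths and the exact
# flag set are assumptions; apply the manual's patches before configuring.
cd openmpi-4.0.4
./configure --prefix="$HOME/openmpi-i8" \
  CC=gcc FC=gfortran \
  FCFLAGS="-fdefault-integer-8" \
  FFLAGS="-fdefault-integer-8"
make -j"$(nproc)" && make install
```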
b) You would need to compile MPI, XC, and PCM with the same GNU compiler version, following the manual's additional instructions.
For instance, you probably overlooked installation example 3 b) on p. 30 of the manual regarding the pcmsolver.F90 file.
c) The libXC issue might be related to this: gitlab.com/libxc/libxc/-/issues/91 and an update may fix it. Or not.
Again, none of the above has been tried with GNU on our side, and there is no guarantee that a)-c) will solve everything. It is very likely that other issues will emerge which none of us can solve with a reasonable effort, hence options 1 and 2.
I hope it works out, let us know.
Best wishes,
Peter
- phillipseeber
- Topic Author
- Offline
- New Member
- Posts: 10
- Thank you received: 0
4 years 4 months ago #936
by phillipseeber
Replied by phillipseeber on topic MRCC with GNU + OpenMPI + PCM + XC
Dear Peter,
thank you for your advice!
Regarding 1) The reason I am using the GNU compilers is that I am trying to obtain a reproducible build within the Nix infrastructure ( nixos.org ) to be able to integrate MRCC into our workflows easily (there are my own and another repository for quantum chemistry software within Nix: gitlab.com/theoretical-chemistry-jena/nixwithchemistry and github.com/markuskowa/NixOS-QChem ). To have MPI support, I do not see another option than building from source; please correct me if I am wrong. Nix basically supports the LLVM and GNU compilers but no closed-source compiler (the issue is already some years old: github.com/NixOS/nixpkgs/issues/32434 ). So if I stick with Nix, there is unfortunately no other option than GNU. I could just use the MRCC binaries and drop MPI as a workaround.
Regarding 2) Indeed everything is compiled with the same GCC/GFortran version.
Regarding 3) The compiler flags and so on that you mention in a) are set correctly, but I will try to follow your suggestions with 8-byte OpenMPI and the patches for LibXC and PCMSolver, and then report back. IntelMPI is unfortunately also not an option in the Nix infrastructure.
Thank you for your help so far; I will update when I have tried all your suggestions.
Best wishes
Phillip
- nagypeter
- Offline
- Premium Member
- MRCC developer
4 years 4 months ago #938
by nagypeter
Replied by nagypeter on topic MRCC with GNU + OpenMPI + PCM + XC
Dear Phillip,
thanks for sharing your plans and efforts with Nix, I hope that there will be a suitable solution.
Just some quick replies, I had no time to dig deeper in Nix and other legalities.
I think MPI support could also be feasible with the binary. It only needs the Intel MPI runtime library and licence, which appears to be free for anyone and might also be distributable via Nix:
software.intel.com/content/www/us/en/dev...y-licensing-faq.html
You need to be certain that this is allowed both by Nix and by Intel; I cannot say for sure that they allow it.
The patches are for OpenMPI and not LibXC as you will probably see.
Let us know if either of the two suggestions works out (GNU, or binary + Intel MPI runtime library).
Best wishes,
Peter
- phillipseeber
- Topic Author
- Offline
- New Member
- Posts: 10
- Thank you received: 0
4 years 4 months ago #943
by phillipseeber
Replied by phillipseeber on topic MRCC with GNU + OpenMPI + PCM + XC
Dear Peter,
thanks to your suggestions, I was able to get a working Nix derivation of MRCC including MPI support. I followed the easy way for now and am using the binaries and IntelMPI. If you are interested, here would be my solution.
One minor problem is left; maybe you have a good idea? When invoking "dmrcc" in a directory containing nothing but "MINP", it crashes with a permission-denied error:
Code:
Generating atomic densities for the SCF calculation...
cp: cannot create regular file 'mrccjunk/GENBAS': Permission denied
Fatal error in cp /home/phillip/.nix-profile/bin/BASIS/H mrccjunk/GENBAS .
Program will stop.
************************ 2020-06-30 14:50:21 *************************
Error at the termination of mrcc.
**********************************************************************
Fatal error in exec integ.
Program will stop.
************************ 2020-06-30 14:50:21 *************************
Error at the termination of mrcc.
**********************************************************************
I assume this is caused by everything in the Nix store having to be read-only (I cannot change this behaviour or set other permissions on GENBAS): GENBAS is copied with "-r--r--r--" permissions and should then somehow be changed by MRCC? When executing "dmrcc" a second time in this directory, everything runs fine and "mrccjunk/GENBAS" is there. Is there maybe some parallel IO going on? If this happens between the execution of two MRCC executables, I could probably work around it by wrapping the second executable, the one which requires write access, in a script that sets the correct permissions after copying. But if not, I would appreciate a hint.
Redistributability is luckily not really a concern, as Nix provides the "requireFile" mechanism, which allows adding the tarballs (or whatever) of non-redistributable programs from the local computer. This is the mechanism I am using for MRCC, and the one Nixpkgs uses for Mathematica, for example.
Maybe I will have another look later at building from source with GNU. With ARM, and in the future probably also RISC-V, as upcoming architectures, I would love to be able to build from source with portable compilers.
Best wishes
Phillip
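The wrapper idea mentioned above could be sketched as follows. The executable name integ and the mrccjunk layout come from this thread, but the wrapper name and the REAL_INTEG placeholder are hypothetical:

```shell
# Hypothetical wrapper sketch: run the real executable, then relax the
# read-only mode that files copied out of the Nix store keep, so a later
# step may overwrite mrccjunk/GENBAS. REAL_INTEG is a placeholder path.
cat > integ-wrapper.sh <<'EOF'
#!/bin/sh
"${REAL_INTEG:-integ}" "$@"
status=$?
# files copied from the store arrive as r--r--r--; make them writable
[ -d mrccjunk ] && chmod -R u+w mrccjunk
exit $status
EOF
chmod +x integ-wrapper.sh
```

Whether the permission fix belongs after integ or another step would need checking against the actual dmrcc call sequence.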
- nagypeter
- Offline
- Premium Member
- MRCC developer
4 years 4 months ago #944
by nagypeter
Replied by nagypeter on topic MRCC with GNU + OpenMPI + PCM + XC
Dear Phillip,
great to hear that the MRCC binary works for you with Nix.
Is this solution available only for your group in Jena or could someone else also use it?
On the permission problem, I can share some details and hints; hopefully we can figure it out together.
This part of MRCC (the atomic density computation for the SCF initial guess) is a bit messy in the code. The main steps with potential relevance here are:
- creating the mrccjunk folder in the directory holding the MINP file
- if there is no GENBAS file beside MINP, copying the basis set file from the BASIS folder, e.g., for hydrogen:
Code:
cp $pathofmrcc/BASIS/H mrccjunk/GENBAS
This seems to be the problematic step. I think there is no writing to GENBAS after this, only reading, so I do not completely understand the permission issue.
I see in your Nix config that the BASIS and MTEST folders were perhaps put in a separate location. This could be problematic; I think MRCC assumes that there is a BASIS folder in the folder of the executables.
I mean, /home/phillip/.nix-profile/bin/BASIS does not seem to be a usual place to store the BASIS folder. Is this the location of, e.g., dmrcc and the other programs as well?
You can try to follow the related steps made by MRCC in the sadscfiguess routine of integ.f.
Alternatively, you can try to mimic the commands of sadscfiguess in your shell to see where the permission problem is. E.g., can you execute the steps below?
Code:
mkdir mrccjunk
cp $pathofmrcc/BASIS/H mrccjunk/GENBAS
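The failure mode can also be reproduced outside MRCC. A minimal sketch (file names invented) showing that a file copied from a read-only location keeps its r--r--r-- mode, so a second copy onto it is refused for a normal user:

```shell
# Mimic the Nix store: a mode-444 source file yields a mode-444 copy,
# because plain cp creates new files with the source's permission bits.
mkdir -p store mrccjunk
echo "H basis data" > store/H
chmod 444 store/H                    # mimic the read-only Nix store
cp store/H mrccjunk/GENBAS           # first copy succeeds
stat -c %a mrccjunk/GENBAS           # GENBAS is now read-only as well
# a second copy onto the unwritable file fails for a non-root user:
cp store/H mrccjunk/GENBAS 2>/dev/null || echo "second copy refused"
```

This matches the observation that the second dmrcc run succeeds only because mrccjunk/GENBAS already exists by then.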
I hope some of this helps or at least gives more clues.
Best wishes,
Peter