Implement interface with pyddx #2767
Conversation
FYI rebasing onto https://github.com/TiborGY/psi4/tree/toc_dbg should allow you to get a debug build, with line numbers in GDB working. Might help with debugging the segfault.
I think I'm going to have a look at the mysterious segfault since I've been doing quite some libmints/libint2-related debugging for #2414 etc. It should not be too hard to track down (hopefully) 😄
Thanks @maxscheurer 👍. Let me know if you need something beyond my reproducer.
Unfortunately I cannot reproduce the segfault... Are you building in a conda environment w/ libint2 from the psi4 channel?
No, I'm building everything from source locally. I can try that though if you think that's better.
Ok, I don't know if self-built L2 could be the reason @loriab.
Thanks for that tip. I tried that and I get the same segfault.
I've been able to reproduce the segfault on Linux and fix it, see #2770.
Good catch! Thanks!
I think the manual conversion was needed before L2; now it is not required anymore. Let me double-check the commit history though 😁
I can confirm that the
@hokru @loriab @maxscheurer This is ready for a first review ;). In particular I'd like to hear your thoughts on the numerical integral class and what should be changed.
Also I'm not sure how to deal with the ambivalence between DDX and PCM in terms of the user-facing flags to enable the models and the Psi variables to store results / energy terms. PCMsolver can do PCM and COSMO, but DDX can provide domain-decomposition variants of these plus, in the future, linearised Poisson-Boltzmann (LPB), which is in some sense an extension of PCM. So looking ahead it feels weird to have a flag PCM decide whether LPB is run. Similarly, DDX and PCMsolver will not give the same values for the solvation energy, but they are still sort of doing the same thing (continuum solvation models). Any thoughts?
This looks great, thanks a lot for the contribution! I will go through the interface code again in detail, but it's very clean and easy to follow.
Some things that came to mind:
- If the `elec_only` functionality is working, one could add LR-DDX for TDDFT. I can look up where the line of code needs to be added. If we want to add this, you could probably generate reference data with Gaussian, if it's not too much to ask.
- Will it be possible to add nuclear gradients in a follow-up PR? I think this would be a much appreciated feature by a lot of Psi users.
- As for the numerical integration, I leave the comments to @hokru et al ☺️
- What about performance in general? Did you run some small benchmark comparing to the existing PCM implementation? Just curious 🧐
Hi @maxscheurer. Let me briefly react to your comments.
Nice idea, but it has not yet been tested to the point where I would be confident in it. I have this planned as a follow-up to this one ... I'll add a todo for now.
Yes, absolutely. That takes a bit of work (as more quantities are needed on the psi4 side), but ddx has them and naturally it would make sense to carry that forward to psi4.
I did not benchmark things rigorously, but e.g. on nitro-aniline in an STO-3G basis the timings were noticeably different. In that setup the main load of the SCF is on the PCM. There the SCF needs around 150s with PCMsolver and around 40s with ddx. Please take this with a grain of salt, as I have done zero testing on how this scales or translates to realistic setups.
Thanks for the much-wanted contribution and nice documentation and testing!
Psi likes to have a uniform interface when one can get the same value from different engines, but this has really only worked out for empirical dispersion. So the other principle is that it's fine for options to be a passthrough to the upstream project. Then PCM, PE, and DDX are effectively engine/upstream specifiers rather than the method flags that "PCM" suggests.
Looks like there's a trivial merge conflict. Lmk if you prefer (1) I resolve with the GH interface, (2) I rebase and force-push, or (3) you want to handle it. Thanks!
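To make the passthrough idea concrete, a hypothetical input comparing the two engines could look roughly like the sketch below. The PCMsolver `pcm = {...}` block mirrors Psi4's existing syntax; the `ddx_*` keyword names are illustrative of this PR's direction, not final.

```
# PCMsolver engine: its options pass through in a PCMsolver-format block
set pcm true
pcm = {
  Medium {
    SolverType = IEFPCM
    Solvent = Water
  }
}

# DDX engine (illustrative keywords): its options pass through as ddx_* keywords,
# so the model (cosmo/pcm/lpb) is chosen here rather than via a generic "PCM" flag
set {
  ddx true
  ddx_model lpb
  ddx_solvent water
}
```

Under this scheme each upstream engine keeps its own option namespace, and the "which solvation model" question is answered inside that namespace rather than by a shared method flag.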
I just fixed it to get the tests running. |
Fine by me. |
Me, too. Only TODO is the options semi-reversion. Sorry for the extra work. |
Description
This PR strives to implement an interface of psi4 to the ddX library, which implements solvation models (COSMO, PCM, linearised Poisson-Boltzmann) following a domain-decomposition approach.
At its current stage I open the PR to get some feedback from devs about the suggested changes and structure and to finalise the upstream python interface of ddX. Note that this PR is currently deliberately done on top of an outdated master, since any commit after #2388 introduces segfaults (details below), which so far I have not been able to track down. Any help on that would be much appreciated.
User API & Changelog headlines
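As a hedged sketch of what the user-facing API could look like (option names follow the direction of this PR and may change before merge), a Psi4 input enabling ddX-based PCM might read:

```
molecule {
  0 1
  O  0.000  0.000  0.000
  H  0.000  0.000  0.960
  H  0.000  0.927 -0.240
}

set {
  basis sto-3g
  ddx true
  ddx_model pcm
  ddx_solvent water
}

energy("scf")
```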
Dev notes & details
Reproducer for the mysterious segfault
As part of rebasing against the current master I encountered a really strange segfault. I managed to produce a minimal example, which has really nothing to do with ddx and only adds a python interface to a simple numerical electrostatic integral. See here for a trimmed-down diff. On my machine checking out this `segfault` branch with `0_configure.sh`, building and running the `mytests/runtests.sh` script gives a segfault inside the numerical integration in the `PCMPotentialInt` class. Note that the code I added is not even called; the call to `PCMPotentialInt` happens from the pcm code, which I have not modified.

Now, commenting out

```
PrintIntegralsFunctor printer;
potential_integrals_->compute(printer);
```

the segfault disappears. I'm getting the weird feeling I'm doing something really stupid here and I just missed it.
Questions
Checklist
- Merge to the largest extent possible the PCM and DDX solvation options (I don't think this can be done easily. Leaving it for now)

Status
Tagging @hokru, @loriab and @andysim who were involved in discussions about this at some point (but a long time ago ... sorry for taking so long to tackle this 😄).