
Conversation

@mfherbst (Contributor) commented Nov 7, 2022

Description

This PR implements an interface from psi4 to the ddX library, which provides solvation models (COSMO, PCM, linearised Poisson-Boltzmann) following a domain-decomposition approach.

At its current stage I am opening the PR to get some feedback from the devs about the suggested changes and structure, and to finalise the upstream Python interface of ddX. Note that this PR is currently deliberately based on an outdated master, since any commit after #2388 introduces segfaults (details below), which so far I have not been able to track down. Any help on that would be much appreciated.

User API & Changelog headlines

  • Implementation of PCM and COSMO solvation models based on the ddX library.

Dev notes & details

  • Introduction of a NumIntHelper class to compute some integrals numerically using a DFT grid (see the sketch after this list)
  • Introduction of the ddx solvation model and new ddx options
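
For context, here is a minimal NumPy sketch (hypothetical function and variable names, not the actual psi4 API) of the kind of grid-based quadrature such a helper performs: the electronic electrostatic potential at a set of cavity points is approximated as a weighted sum over the DFT grid, V(s) = -sum_g w_g rho(r_g) / |s - r_g|.

```python
import numpy as np

def electrostatic_potential_on_cavity(grid_xyz, grid_w, rho, cavity_xyz):
    """Quadrature estimate of the electronic electrostatic potential at
    each cavity point s: V(s) = -sum_g w_g * rho(r_g) / |s - r_g|.

    grid_xyz   : (N, 3) DFT grid point coordinates r_g
    grid_w     : (N,)   quadrature weights w_g
    rho        : (N,)   electron density evaluated at the grid points
    cavity_xyz : (M, 3) cavity points s where the potential is requested
    """
    # Pairwise distances |s_m - r_g|, shape (M, N)
    dist = np.linalg.norm(cavity_xyz[:, None, :] - grid_xyz[None, :, :], axis=-1)
    # Weighted sum over grid points; minus sign for the electronic charge
    return -np.einsum("g,g,mg->m", grid_w, rho, 1.0 / dist)
```

In the PR itself the analogous work is done in C++ (and against basis-function pairs where Fock-matrix contributions are needed); the sketch only illustrates the quadrature idea.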

Reproducer for the mysterious segfault

As part of rebasing against the current master I encountered a really strange segfault. I managed to produce a minimal example, which has nothing to do with ddx and only adds a Python interface to a simple numerical electrostatic integral; see here for a trimmed-down diff. On my machine, checking out this segfault branch, configuring with 0_configure.sh, building, and running the mytests/runtests.sh script gives a segfault inside the numerical integration in the PCMPotentialInt class. Note that the code I added is not even called; the call to PCMPotentialInt happens from the pcm code, which I have not modified.

Now, commenting out

PrintIntegralsFunctor printer;
potential_integrals_->compute(printer);

the segfault disappears. I get the nagging feeling that I am doing something really stupid here and have just missed it.

Questions

  • Thoughts on the NumIntHelper?
  • Is D -> D_cart needed or not? (this stuff)
  • Any idea on the segfault when rebasing against master?

Checklist

  • Find and fix segfaults when using psi4 master
  • Remove debug restriction to one thread in numerical integrals
  • Documentation
  • Bring forward more options to change numerical grid details for DDX solvation models
  • Merge the PCM and DDX solvation options to the largest extent possible (I don't think this can be done easily; leaving it for now)
  • Tests added for any new features
  • All or relevant fraction of full tests run

Status

  • Ready for review
  • Ready for merge

Tagging @hokru, @loriab and @andysim, who were involved in discussions about this at some point (but a long time ago ... sorry for taking so long to tackle this 😄).

@TiborGY (Contributor) commented Nov 7, 2022

FYI, rebasing onto https://github.com/TiborGY/psi4/tree/toc_dbg should allow you to get a debug build with line numbers working in GDB. That might help with debugging the segfault.

@maxscheurer (Contributor) commented Nov 8, 2022

I think I'm going to have a look at the mysterious segfault, since I've been doing quite a bit of libmints/libint2-related debugging for #2414 etc. It should not be too hard to track down (hopefully) 😄

@mfherbst (Contributor, Author) commented Nov 8, 2022

Thanks @maxscheurer 👍. Let me know if you need something beyond my reproducer.

@maxscheurer (Contributor) commented:

Unfortunately I cannot reproduce the segfault... Are you building in a conda environment w/ libint2 from the psi4 channel?

@mfherbst (Contributor, Author) commented Nov 8, 2022

Are you building in a conda environment w/ libint2 from the psi4 channel?

No, I'm building everything from source locally. I can try that though, if you think that's better.

@maxscheurer (Contributor) commented Nov 8, 2022

No, I'm building everything from source locally. I can try that though, if you think that's better.

OK, I don't know whether a self-built libint2 could be the reason, @loriab.
If you could create an environment via conda create -n psi4_dev psi4-dev -c psi4/label/dev -c defaults --override-channels
and try building in that environment with what psi4-path-advisor --psi4-compile gives you (plus the flag enabling the PCM solver), that would be worth a shot, I think.

@mfherbst (Contributor, Author) commented Nov 8, 2022

Thanks for that tip. I tried that and I get the same segfault.

@maxscheurer mentioned this pull request Nov 9, 2022
@maxscheurer (Contributor) commented:

I've been able to reproduce the segfault on Linux and fix it, see #2770.

@mfherbst (Contributor, Author) commented Nov 9, 2022

Good catch! Thanks!

@maxscheurer (Contributor) commented:

Is D -> D_cart needed or not?

I think the manual conversion was needed before the switch to libint2 (L2); now it is not required anymore. Let me double-check the commit history though 😁

@maxscheurer (Contributor) commented:

I can confirm that the D_cart conversion is not needed; it was removed in the L2 OEI PR here, for example, and this is still the case in the current PCM implementation.

@mfherbst force-pushed the newddx_norebase branch 2 times, most recently from 6fbf119 to 731e4df, on November 19, 2022 19:56
@mfherbst marked this pull request as ready for review on November 20, 2022 09:36
@mfherbst (Contributor, Author) commented:

@hokru @loriab @maxscheurer This is ready for a first review ;). In particular, I'd like to hear your thoughts on the numerical integral class and what should be changed.

@mfherbst (Contributor, Author) commented Nov 24, 2022

Also, I'm not sure how to deal with the overlap between DDX and PCM in terms of the user-facing flags to enable the models and the Psi variables used to store results / energy terms.

PCMsolver can do PCM and COSMO, but DDX can provide domain-decomposition variants of these plus, in the future, linearised Poisson-Boltzmann (LPB), which is in some sense an extension of PCM. So looking ahead it feels weird to have a flag PCM decide whether LPB is run. Similarly, DDX and PCMsolver will not give the same values for the solvation energy, but they are still essentially doing the same thing (continuum solvation models). Any thoughts?
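
To make the question concrete, here is one possible user-facing layout, as a sketch only; the option names below are hypothetical and not necessarily what this PR implements. A boolean would enable the DDX engine and a separate keyword would pick the model:

```python
import psi4

psi4.set_options({
    # Hypothetical option names, shown only to illustrate the layout
    "ddx": True,             # enable the DDX solvation engine
    "ddx_model": "pcm",      # "cosmo", "pcm", or (in the future) "lpb"
    "ddx_solvent": "water",  # solvent selection handled by the engine
})
```

The design question is then whether such a model keyword should live under the engine (as sketched) or be shared across engines.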

@maxscheurer (Contributor) left a comment

This looks great, thanks a lot for the contribution! I will go through the interface code again in detail, but it's very clean and easy to follow.

Some things that came to mind:

  • If the elec_only functionality is working, one could add LR-DDX for TDDFT. I can look up where the line of code needs to be added. If we want to add this, you could probably generate reference data with Gaussian, if it's not too much to ask.
  • Will it be possible to add nuclear gradients in a follow-up PR? I think this would be a much appreciated feature by a lot of Psi users.
  • As for the numerical integration, I leave the comments to @hokru et al ☺️
  • What about performance in general? Did you run some small benchmark comparing to the existing PCM implementation? Just curious 🧐

@mfherbst (Contributor, Author) commented Dec 2, 2022

Hi @maxscheurer. Let me briefly react to your comments

If the elec_only functionality is working, one could add LR-DDX for TDDFT. I can look up where the line of code needs to be added. If we want to add this, you could probably generate reference data with Gaussian, if it's not too much to ask.

Nice idea, but it has not yet been tested to the point where I would be confident in it. I have this planned as a follow-up to this one ... I'll add a todo for now.

Will it be possible to add nuclear gradients in a follow-up PR? I think this would be a much appreciated feature by a lot of Psi users.

Yes, absolutely. That takes a bit of work (as more quantities are needed on the psi4 side), but ddx provides them, and naturally it would make sense to carry that forward to psi4.

What about performance in general? Did you run some small benchmark comparing to the existing PCM implementation?

I did not benchmark things rigorously, but e.g. on nitro-aniline in an STO-3G basis the timings were noticeably different. In that setup the main load of the SCF is in the solvation part: the SCF needs around 150 s with PCMsolver and around 40 s with ddx, i.e. roughly a factor of 3-4. Please take this with a grain of salt, as I have not tested how this scales or translates to realistic setups.

@loriab (Member) left a comment

Thanks for the much-wanted contribution and nice documentation and testing!

@loriab (Member) commented Dec 4, 2022

Also, I'm not sure how to deal with the overlap between DDX and PCM in terms of the user-facing flags to enable the models and the Psi variables used to store results / energy terms.

PCMsolver can do PCM and COSMO, but DDX can provide domain-decomposition variants of these plus, in the future, linearised Poisson-Boltzmann (LPB), which is in some sense an extension of PCM. So looking ahead it feels weird to have a flag PCM decide whether LPB is run. Similarly, DDX and PCMsolver will not give the same values for the solvation energy, but they are still essentially doing the same thing (continuum solvation models). Any thoughts?

Psi likes to have a uniform interface when one can get the same value from different engines, but this has really only worked out for empirical dispersion. So the other principle is that it's fine for options to be a passthrough to the upstream project. Then PCM, PE, and DDX are effectively engine/upstream specifiers rather than the method flags that "PCM" suggests.

@loriab (Member) commented Dec 4, 2022

Looks like there's a trivial merge conflict. Let me know whether you prefer that (1) I resolve it via the GitHub interface, (2) I rebase and force-push, or (3) you handle it yourself. Thanks!

@mfherbst (Contributor, Author) commented Dec 5, 2022

Looks like there's a trivial merge conflict.

I just fixed it to get the tests running.

@maxscheurer (Contributor) commented:

Let's postpone the tighter convergence checking and merge this to get it into the upcoming 1.7 release ☺️ Are you OK with this, @mfherbst @loriab?

@mfherbst (Contributor, Author) commented Dec 5, 2022

Let's postpone the tighter convergence checking and merge this to get it into the upcoming 1.7 release

Fine by me.

@loriab (Member) commented Dec 5, 2022

Let's postpone the tighter convergence checking and merge this to get it into the upcoming 1.7 release

Fine by me.

Me too. The only TODO is the options semi-reversion. Sorry for the extra work.

@maxscheurer merged commit 8c9fef3 into psi4:master on Dec 5, 2022
@loriab added this to the Psi4 1.7 milestone on Dec 5, 2022
@loriab added the external-interface label on Dec 5, 2022
@mfherbst deleted the newddx_norebase branch on December 5, 2022