Releases: ACEsuit/mace-foundations
mace_omol_0
MACE model trained on the omol 100M dataset with spin and charge embedding.
The model is distributed under the ASL license.
Also added a model trained only on the 4M subset; it can be more stable for highly negative clusters, but we recommend using MACE-omol-0-extra-large-1024.model.
mace_matpes_0
mace_omat_0
mace_mpa_0
Updated the README with information about the new models.
mace_mp_0b3
Updated the README with information about the new models.
mace_mp_0b2
Fixed train_file and valid_file in mace_mp_0b.
mace_mp_0b
Full Changelog: mace_mp_0b...mace_mp_0b
mace-mp-0
Full Changelog: https://github.com/ACEsuit/mace-mp/commits/mace_mp_0
- MACE-MP-0 small (3,847,696 params): 2023-12-10-mace-128-L0_energy_epoch-249.model
- MACE-MP-0 medium (4,688,656 params): 2023-12-03-mace-128-L1_epoch-199.model
- MACE-MP-0 large (5,725,072 params): 2024-01-07-mace-128-L2_epoch-199.model