
Best of Atomistic Machine Learning βš›οΈπŸ§¬πŸ’Ž

πŸ†  A ranked list of awesome atomistic machine learning (AML) projects. Updated regularly.


This curated list contains 430 awesome open-source projects with a combined total of 200K stars, grouped into 22 categories. All projects are ranked by a project-quality score, which is calculated from various metrics automatically collected from GitHub and different package managers. If you would like to add or update a project, feel free to open an issue, submit a pull request, or directly edit the projects.yaml.
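
The exact weighting behind the quality score is internal to the best-of generator, but a toy sketch (metrics and weights chosen here purely for illustration, not the real formula) conveys the idea of combining log-scaled popularity metrics with maintenance penalties:

```python
import math

def quality_score(stars, contributors, forks, open_issue_ratio, months_since_update):
    """Toy project-quality score; weights are illustrative, not the generator's."""
    score = (
        2.0 * math.log10(stars + 1)           # popularity, log-scaled
        + 1.0 * math.log10(contributors + 1)  # community size
        + 0.5 * math.log10(forks + 1)         # reuse
        - 2.0 * open_issue_ratio              # maintenance backlog
        - 0.3 * months_since_update           # recency penalty
    )
    return max(round(score), 0)

# FLARE-like metrics: 300 stars, 43 contributors, 71 forks, 16% open, fresh
print(quality_score(300, 43, 71, 0.16, 1))
```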

The current focus of this list is more on simulation data than on experimental data, and more on materials than on drug design. Nevertheless, contributions from other fields are warmly welcome!

How to cite. See the "Cite this repository" button in the right sidebar.

πŸ§™β€β™‚οΈ Discover other best-of lists or create your own.

Contents

Explanation

  • πŸ₯‡πŸ₯ˆπŸ₯‰  Combined project-quality score
  • ⭐️  Star count from GitHub
  • 🐣  New project (less than 6 months old)
  • πŸ’€  Inactive project (6 months no activity)
  • πŸ’€  Dead project (12 months no activity)
  • πŸ“ˆπŸ“‰  Project is trending up or down
  • βž•  Project was recently added
  • πŸ‘¨β€πŸ’»  Contributors count from GitHub
  • πŸ”€  Fork count from GitHub
  • πŸ“‹  Issue count from GitHub
  • ⏱️  Last update timestamp on package manager
  • πŸ“₯  Download count from package manager
  • πŸ“¦  Number of dependent projects
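
The 🐣/πŸ’€ status flags follow the age thresholds stated in the legend above; a small classifier (the function itself is illustrative, only the 6- and 12-month cutoffs come from the legend) can make the rules explicit:

```python
from datetime import date

def status_emoji(created: date, last_activity: date, today: date) -> str:
    """Map a project's timestamps to the legend's status flags (toy version)."""
    months_idle = (today - last_activity).days / 30.44  # average month length
    age_months = (today - created).days / 30.44
    if age_months < 6:
        return "🐣"            # new project (less than 6 months old)
    if months_idle >= 12:
        return "πŸ’€ (dead)"     # 12 months without activity
    if months_idle >= 6:
        return "πŸ’€ (inactive)" # 6 months without activity
    return ""                  # active, no flag

print(status_emoji(date(2020, 1, 1), date(2023, 6, 1), date(2025, 1, 1)))
```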


Active learning

Back to top

Projects that focus on enabling active learning, i.e., iterative learning schemes for atomistic ML.
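
As a rough illustration of what such an iterative scheme does, the loop below picks the pool point where a toy uncertainty proxy (distance to the nearest already-labeled point) is largest, queries an expensive "oracle" (in practice, e.g., a DFT calculation), and repeats. It is a generic sketch, not the algorithm of any listed package:

```python
import random

def active_learning_loop(pool, oracle, n_rounds=3):
    """Toy uncertainty-driven selection: label the most 'novel' pool point each round."""
    labeled = {}
    for _ in range(n_rounds):
        candidate = max(
            (x for x in pool if x not in labeled),
            # uncertainty proxy: distance to the nearest already-labeled point
            key=lambda x: min((abs(x - y) for y in labeled), default=float("inf")),
        )
        labeled[candidate] = oracle(candidate)  # expensive call (e.g. a DFT run)
    return labeled

random.seed(0)
pool = [random.uniform(0, 10) for _ in range(20)]
picked = active_learning_loop(pool, oracle=lambda x: x ** 2, n_rounds=4)
print(sorted(picked))
```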

FLARE (πŸ₯‡21 Β· ⭐ 300) - An open-source Python package for creating fast and accurate interatomic potentials. MIT C++ ML-IAP - [GitHub](https://github.com/mir-group/flare) (πŸ‘¨β€πŸ’» 43 Β· πŸ”€ 71 Β· πŸ“₯ 8 Β· πŸ“¦ 12 Β· πŸ“‹ 220 - 16% open Β· ⏱️ 01.11.2024):
git clone https://github.com/mir-group/flare
IPSuite (πŸ₯ˆ17 Β· ⭐ 19) - A Python toolkit for FAIR development and deployment of machine-learned interatomic potentials. EPL-2.0 ML-IAP MD workflows HTC FAIR - [GitHub](https://github.com/zincware/IPSuite) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 11 Β· πŸ“¦ 7 Β· πŸ“‹ 140 - 50% open Β· ⏱️ 17.12.2024):
git clone https://github.com/zincware/IPSuite
- [PyPi](https://pypi.org/project/ipsuite) (πŸ“₯ 500 / month Β· πŸ“¦ 2 Β· ⏱️ 04.12.2024):
pip install ipsuite
Finetuna (πŸ₯‰10 Β· ⭐ 46 Β· πŸ’€) - Active Learning for Machine Learning Potentials. MIT - [GitHub](https://github.com/ulissigroup/finetuna) (πŸ‘¨β€πŸ’» 11 Β· πŸ”€ 11 Β· πŸ“¦ 1 Β· πŸ“‹ 20 - 25% open Β· ⏱️ 15.05.2024):
git clone https://github.com/ulissigroup/finetuna
Show 3 hidden projects... - flare++ (πŸ₯ˆ13 Β· ⭐ 35 Β· πŸ’€) - A many-body extension of the FLARE code. MIT C++ ML-IAP - ACEHAL (πŸ₯‰5 Β· ⭐ 11 Β· πŸ’€) - Hyperactive Learning (HAL) Python interface for building Atomic Cluster Expansion potentials. Unlicensed Julia - ALEBREW (πŸ₯‰2 Β· ⭐ 14) - Official repository for the paper Uncertainty-biased molecular dynamics for learning uniformly accurate interatomic.. Custom ML-IAP MD


Community resources

Back to top

Projects that collect atomistic ML resources or foster communication within the community.

πŸ”— AI for Science Map - Interactive mindmap of the AI4Science research field, including atomistic machine learning, with papers,..

πŸ”— Atomic Cluster Expansion - Atomic Cluster Expansion (ACE) community homepage.

πŸ”— CrystaLLM - Generate a crystal structure from a composition. language-models generative pretrained transformer

πŸ”— GAP-ML.org - Gaussian Approximation Potential (GAP) community homepage. ML-IAP

πŸ”— matsci.org - A community forum for the discussion of anything materials science, with a focus on computational materials science..

πŸ”— Matter Modeling Stack Exchange - Machine Learning - Machine-learning-tagged questions on the Matter Modeling Stack Exchange site.

πŸ”— ACE / GRACE support - Support forum for the Atomic Cluster Expansion (ACE) and extensions.

Best-of Machine Learning with Python (πŸ₯‡23 Β· ⭐ 18K) - A ranked list of awesome machine learning Python libraries. Updated weekly. CC-BY-4.0 general-ml Python - [GitHub](https://github.com/ml-tooling/best-of-ml-python) (πŸ‘¨β€πŸ’» 50 Β· πŸ”€ 2.5K Β· πŸ“‹ 61 - 44% open Β· ⏱️ 27.12.2024):
git clone https://github.com/ml-tooling/best-of-ml-python
OpenML (πŸ₯‡19 Β· ⭐ 680) - Open Machine Learning. BSD-3 datasets - [GitHub](https://github.com/openml/OpenML) (πŸ‘¨β€πŸ’» 35 Β· πŸ”€ 90 Β· πŸ“‹ 930 - 39% open Β· ⏱️ 07.12.2024):
git clone https://github.com/openml/OpenML
MatBench Discovery (πŸ₯‡19 Β· ⭐ 120) - An evaluation framework for machine learning models simulating high-throughput materials discovery. MIT datasets benchmarking model-repository - [GitHub](https://github.com/janosh/matbench-discovery) (πŸ‘¨β€πŸ’» 11 Β· πŸ”€ 22 Β· πŸ“¦ 4 Β· πŸ“‹ 49 - 8% open Β· ⏱️ 27.12.2024):
git clone https://github.com/janosh/matbench-discovery
- [PyPi](https://pypi.org/project/matbench-discovery) (πŸ“₯ 900 / month Β· ⏱️ 11.09.2024):
pip install matbench-discovery
Graph-based Deep Learning Literature (πŸ₯ˆ18 Β· ⭐ 4.9K) - links to conference publications in graph-based deep learning. MIT general-ml rep-learn - [GitHub](https://github.com/naganandy/graph-based-deep-learning-literature) (πŸ‘¨β€πŸ’» 12 Β· πŸ”€ 770 Β· ⏱️ 12.12.2024):
git clone https://github.com/naganandy/graph-based-deep-learning-literature
MatBench (πŸ₯ˆ17 Β· ⭐ 140 Β· πŸ’€) - Matbench: Benchmarks for materials science property prediction. MIT datasets benchmarking model-repository - [GitHub](https://github.com/materialsproject/matbench) (πŸ‘¨β€πŸ’» 25 Β· πŸ”€ 45 Β· πŸ“¦ 20 Β· πŸ“‹ 65 - 60% open Β· ⏱️ 20.01.2024):
git clone https://github.com/materialsproject/matbench
- [PyPi](https://pypi.org/project/matbench) (πŸ“₯ 330 / month Β· πŸ“¦ 2 Β· ⏱️ 27.07.2022):
pip install matbench
GT4SD - Generative Toolkit for Scientific Discovery (πŸ₯ˆ15 Β· ⭐ 340) - Gradio apps of generative models in GT4SD. MIT generative pretrained drug-discovery model-repository - [GitHub](https://github.com/GT4SD/gt4sd-core) (πŸ‘¨β€πŸ’» 20 Β· πŸ”€ 72 Β· πŸ“‹ 120 - 12% open Β· ⏱️ 12.09.2024):
git clone https://github.com/GT4SD/gt4sd-core
AI for Science Resources (πŸ₯ˆ13 Β· ⭐ 550) - List of resources for AI4Science research, including learning resources. GPL-3.0 license - [GitHub](https://github.com/divelab/AIRS) (πŸ‘¨β€πŸ’» 30 Β· πŸ”€ 63 Β· πŸ“‹ 20 - 15% open Β· ⏱️ 15.11.2024):
git clone https://github.com/divelab/AIRS
GNoME Explorer (πŸ₯ˆ10 Β· ⭐ 920) - Graph Networks for Materials Exploration Database. Apache-2 datasets materials-discovery - [GitHub](https://github.com/google-deepmind/materials_discovery) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 150 Β· πŸ“‹ 25 - 84% open Β· ⏱️ 09.12.2024):
git clone https://github.com/google-deepmind/materials_discovery
Neural-Network-Models-for-Chemistry (πŸ₯ˆ10 Β· ⭐ 100) - A collection of Neural Network Models for chemistry. Unlicensed rep-learn - [GitHub](https://github.com/Eipgen/Neural-Network-Models-for-Chemistry) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 16 Β· πŸ“‹ 2 - 50% open Β· ⏱️ 31.12.2024):
git clone https://github.com/Eipgen/Neural-Network-Models-for-Chemistry
Awesome Materials Informatics (πŸ₯ˆ9 Β· ⭐ 400) - Curated list of known efforts in materials informatics, i.e. in modern materials science. Custom - [GitHub](https://github.com/tilde-lab/awesome-materials-informatics) (πŸ‘¨β€πŸ’» 19 Β· πŸ”€ 85 Β· ⏱️ 18.09.2024):
git clone https://github.com/tilde-lab/awesome-materials-informatics
Awesome Neural Geometry (πŸ₯‰8 Β· ⭐ 940) - A curated collection of resources and research related to the geometry of representations in the brain, deep networks,.. Unlicensed educational rep-learn - [GitHub](https://github.com/neurreps/awesome-neural-geometry) (πŸ‘¨β€πŸ’» 12 Β· πŸ”€ 59 Β· ⏱️ 25.09.2024):
git clone https://github.com/neurreps/awesome-neural-geometry
Awesome-Crystal-GNNs (πŸ₯‰8 Β· ⭐ 76) - This repository contains a collection of resources and papers on GNN Models on Crystal Solid State Materials. MIT - [GitHub](https://github.com/kdmsit/Awesome-Crystal-GNNs) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 9 Β· ⏱️ 22.12.2024):
git clone https://github.com/kdmsit/Awesome-Crystal-GNNs
optimade.science (πŸ₯‰8 Β· ⭐ 8 Β· πŸ’€) - A sky-scanner Optimade browser-only GUI. MIT datasets - [GitHub](https://github.com/tilde-lab/optimade.science) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 2 Β· πŸ“‹ 26 - 26% open Β· ⏱️ 10.06.2024):
git clone https://github.com/tilde-lab/optimade.science
Awesome Neural SBI (πŸ₯‰7 Β· ⭐ 100) - Community-sourced list of papers and resources on neural simulation-based inference. MIT active-learning - [GitHub](https://github.com/smsharma/awesome-neural-sbi) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 7 Β· πŸ“‹ 2 - 50% open Β· ⏱️ 23.11.2024):
git clone https://github.com/smsharma/awesome-neural-sbi
AI for Science paper collection (πŸ₯‰7 Β· ⭐ 84) - List the AI for Science papers accepted by top conferences. Apache-2 - [GitHub](https://github.com/sherrylixuecheng/AI_for_Science_paper_collection) (πŸ‘¨β€πŸ’» 5 Β· πŸ”€ 9 Β· ⏱️ 14.09.2024):
git clone https://github.com/sherrylixuecheng/AI_for_Science_paper_collection
Awesome-Graph-Generation (πŸ₯‰6 Β· ⭐ 310) - A curated list of up-to-date graph generation papers and resources. Unlicensed rep-learn - [GitHub](https://github.com/yuanqidu/awesome-graph-generation) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 19 Β· ⏱️ 14.10.2024):
git clone https://github.com/yuanqidu/awesome-graph-generation
The Collection of Database and Dataset Resources in Materials Science (πŸ₯‰6 Β· ⭐ 280) - A list of databases, datasets and books/handbooks where you can find materials properties for machine learning.. Unlicensed datasets - [GitHub](https://github.com/sedaoturak/data-resources-for-materials-science) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 48 Β· πŸ“‹ 2 - 50% open Β· ⏱️ 18.12.2024):
git clone https://github.com/sedaoturak/data-resources-for-materials-science
Show 7 hidden projects... - MoLFormers UI (πŸ₯ˆ9 Β· ⭐ 280 Β· πŸ’€) - A family of foundation models trained on chemicals. Apache-2 transformer language-models pretrained drug-discovery - A Highly Opinionated List of Open-Source Materials Informatics Resources (πŸ₯‰7 Β· ⭐ 120 Β· πŸ’€) - A Highly Opinionated List of Open Source Materials Informatics Resources. MIT - MADICES Awesome Interoperability (πŸ₯‰7 Β· ⭐ 1) - Linked data interoperability resources of the Machine-actionable data interoperability for the chemical sciences.. MIT datasets - Geometric-GNNs (πŸ₯‰4 Β· ⭐ 96 Β· πŸ’€) - List of Geometric GNNs for 3D atomic systems. Unlicensed datasets educational rep-learn - Does this material exist? (πŸ₯‰4 Β· ⭐ 15 Β· πŸ’€) - Vote on whether you think predicted crystal structures could be synthesised. MIT for-fun materials-discovery - GitHub topic materials-informatics (πŸ₯‰1) - GitHub topic materials-informatics. Unlicensed - MateriApps (πŸ₯‰1) - A Portal Site of Materials Science Simulation. Unlicensed


Datasets

Back to top

Datasets, databases and trained models for atomistic ML.

πŸ”— Alexandria Materials Database - A database of millions of theoretical crystal structures (3D, 2D and 1D) discovered by machine learning accelerated..

πŸ”— Catalysis Hub - A web platform for sharing data and software for computational catalysis research.

πŸ”— Citrination Datasets - AI-Powered Materials Data Platform. Open Citrination has been decommissioned.

πŸ”— crystals.ai - Curated datasets for reproducible AI in materials science.

πŸ”— DeepChem Models - DeepChem models on HuggingFace. model-repository pretrained language-models

πŸ”— Graphs of Materials Project 20190401 - The dataset used to train the MEGNet interatomic potential. ML-IAP

πŸ”— HME21 Dataset - High-temperature multi-element 2021 dataset for the PreFerred Potential (PFP).. UIP

πŸ”— JARVIS-Leaderboard ( ⭐ 62) - A large scale benchmark of materials design methods: https://www.nature.com/articles/s41524-024-01259-w. model-repository benchmarking community-resource educational

πŸ”— Materials Project - Charge Densities - Materials Project has started offering charge density information available for download via their public API.

πŸ”— Materials Project Trajectory (MPtrj) Dataset - The dataset used to train the CHGNet universal potential. UIP

πŸ”— matterverse.ai - Database of yet-to-be-synthesized materials predicted using state-of-the-art machine learning algorithms.

πŸ”— MPF.2021.2.8 - The dataset used to train the M3GNet universal potential. UIP

πŸ”— NRELMatDB - Computational materials database with the specific focus on materials for renewable energy applications including, but..

πŸ”— Quantum-Machine.org Datasets - Collection of datasets, including QM7, QM9, etc. MD, DFT. Small organic molecules, mostly.

πŸ”— sGDML Datasets - MD17, MD22, DFT datasets.

πŸ”— MoleculeNet - A Benchmark for Molecular Machine Learning. benchmarking

πŸ”— ZINC15 - A free database of commercially-available compounds for virtual screening. ZINC contains over 230 million purchasable.. graph biomolecules

πŸ”— ZINC20 - A free database of commercially-available compounds for virtual screening. ZINC contains over 230 million purchasable.. graph biomolecules

FAIR Chemistry datasets (πŸ₯‡25 Β· ⭐ 940 Β· πŸ“‰) - Datasets OC20, OC22, etc. Formerly known as Open Catalyst Project. MIT catalysis - [GitHub](https://github.com/FAIR-Chem/fairchem) (πŸ‘¨β€πŸ’» 43 Β· πŸ”€ 260 Β· πŸ“‹ 250 - 11% open Β· ⏱️ 20.12.2024):
git clone https://github.com/FAIR-Chem/fairchem
- [PyPi](https://pypi.org/project/fairchem-core) (πŸ“₯ 4.8K / month Β· πŸ“¦ 3 Β· ⏱️ 19.12.2024):
pip install fairchem-core
OPTIMADE Python tools (πŸ₯‡25 Β· ⭐ 72) - Tools for implementing and consuming OPTIMADE APIs in Python. MIT - [GitHub](https://github.com/Materials-Consortia/optimade-python-tools) (πŸ‘¨β€πŸ’» 28 Β· πŸ”€ 44 Β· πŸ“¦ 61 Β· πŸ“‹ 470 - 24% open Β· ⏱️ 27.12.2024):
git clone https://github.com/Materials-Consortia/optimade-python-tools
- [PyPi](https://pypi.org/project/optimade) (πŸ“¦ 4 Β· ⏱️ 27.12.2024):
pip install optimade
- [Conda](https://anaconda.org/conda-forge/optimade) (πŸ“₯ 100K Β· ⏱️ 28.12.2024):
conda install -c conda-forge optimade
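
For readers new to OPTIMADE, a query is a plain REST call whose `filter` parameter uses the OPTIMADE filter grammar. A minimal sketch of building such a URL (the base URL is a placeholder; any OPTIMADE-compliant provider exposes a `/v1/structures` endpoint):

```python
from urllib.parse import urlencode

# Base URL is illustrative; substitute a real provider's OPTIMADE endpoint.
base = "https://example.org/optimade/v1/structures"
params = {
    "filter": 'elements HAS ALL "Si","O" AND nelements=2',  # OPTIMADE filter grammar
    "page_limit": 10,
}
url = f"{base}?{urlencode(params)}"
print(url)
```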
MPContribs (πŸ₯‡22 Β· ⭐ 37 Β· πŸ“‰) - Platform for materials scientists to contribute and disseminate their materials data through Materials Project. MIT - [GitHub](https://github.com/materialsproject/MPContribs) (πŸ‘¨β€πŸ’» 25 Β· πŸ”€ 23 Β· πŸ“¦ 41 Β· πŸ“‹ 100 - 22% open Β· ⏱️ 30.12.2024):
git clone https://github.com/materialsproject/MPContribs
- [PyPi](https://pypi.org/project/mpcontribs-client) (πŸ“¦ 3 Β· ⏱️ 17.10.2024):
pip install mpcontribs-client
load-atoms (πŸ₯ˆ18 Β· ⭐ 39) - download and manipulate atomistic datasets. MIT data-structures - [GitHub](https://github.com/jla-gardner/load-atoms) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 3 Β· πŸ“¦ 5 Β· πŸ“‹ 32 - 6% open Β· ⏱️ 16.12.2024):
git clone https://github.com/jla-gardner/load-atoms
- [PyPi](https://pypi.org/project/load-atoms) (πŸ“₯ 2.3K / month Β· πŸ“¦ 2 Β· ⏱️ 13.12.2024):
pip install load-atoms
Open Databases Integration for Materials Design (OPTIMADE) (πŸ₯ˆ17 Β· ⭐ 83 Β· πŸ’€) - Specification of a common REST API for access to materials databases. CC-BY-4.0 - [GitHub](https://github.com/Materials-Consortia/OPTIMADE) (πŸ‘¨β€πŸ’» 21 Β· πŸ”€ 35 Β· πŸ“‹ 240 - 28% open Β· ⏱️ 12.06.2024):
git clone https://github.com/Materials-Consortia/OPTIMADE
Meta Open Materials 2024 (OMat24) Dataset (πŸ₯ˆ15 Β· ⭐ 930) - Contains over 100 million Density Functional Theory calculations focused on structural and compositional diversity. CC-BY-4.0 - [GitHub](https://github.com/FAIR-Chem/fairchem) (πŸ”€ 260):
git clone https://github.com/FAIR-Chem/fairchem
- [PyPi](https://pypi.org/project/fairchem-core) (πŸ“₯ 4.8K / month Β· πŸ“¦ 3 Β· ⏱️ 19.12.2024):
pip install fairchem-core
QH9 (πŸ₯ˆ13 Β· ⭐ 550) - A Quantum Hamiltonian Prediction Benchmark. CC-BY-NC-SA-4.0 ML-DFT - [GitHub](https://github.com/divelab/AIRS) (πŸ‘¨β€πŸ’» 30 Β· πŸ”€ 63 Β· πŸ“‹ 20 - 15% open Β· ⏱️ 15.11.2024):
git clone https://github.com/divelab/AIRS
SPICE (πŸ₯ˆ11 Β· ⭐ 160) - A collection of QM data for training potential functions. MIT ML-IAP MD - [GitHub](https://github.com/openmm/spice-dataset) (πŸ‘¨β€πŸ’» 1 Β· πŸ”€ 9 Β· πŸ“₯ 280 Β· πŸ“‹ 69 - 24% open Β· ⏱️ 19.08.2024):
git clone https://github.com/openmm/spice-dataset
AIS Square (πŸ₯ˆ9 Β· ⭐ 13) - A collaborative and open-source platform for sharing AI for Science datasets, models, and workflows. Home of the.. LGPL-3.0 community-resource model-repository - [GitHub](https://github.com/deepmodeling/AIS-Square) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 8 Β· πŸ“‹ 6 - 83% open Β· ⏱️ 28.12.2024):
git clone https://github.com/deepmodeling/AIS-Square
Materials Data Facility (MDF) (πŸ₯ˆ9 Β· ⭐ 10 Β· πŸ’€) - A simple way to publish, discover, and access materials datasets. Publication of very large datasets supported (e.g.,.. Apache-2 - [GitHub](https://github.com/materials-data-facility/connect_client) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 1 Β· πŸ“‹ 7 - 14% open Β· ⏱️ 05.02.2024):
git clone https://github.com/materials-data-facility/connect_client
3DSC Database (πŸ₯‰6 Β· ⭐ 16) - Repo for the paper publishing the superconductor database with 3D crystal structures. Custom superconductors materials-discovery - [GitHub](https://github.com/aimat-lab/3DSC) (πŸ”€ 5 Β· πŸ“‹ 2 - 50% open Β· ⏱️ 21.11.2024):
git clone https://github.com/aimat-lab/3DSC
The Perovskite Database Project (πŸ₯‰5 Β· ⭐ 60 Β· πŸ’€) - Perovskite Database Project aims at making all perovskite device data, both past and future, available in a form.. Unlicensed community-resource - [GitHub](https://github.com/Jesperkemist/perovskitedatabase) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 20 Β· ⏱️ 07.03.2024):
git clone https://github.com/Jesperkemist/perovskitedatabase
Show 16 hidden projects... - ATOM3D (πŸ₯ˆ17 Β· ⭐ 300 Β· πŸ’€) - ATOM3D: tasks on molecules in three dimensions. MIT biomolecules benchmarking - OpenKIM (πŸ₯ˆ10 Β· ⭐ 32 Β· πŸ’€) - The Open Knowledgebase of Interatomic Models (OpenKIM) aims to be an online resource for standardized testing, long-.. LGPL-2.1 model-repository knowledge-base pretrained - 2DMD dataset (πŸ₯ˆ9 Β· ⭐ 6 Β· πŸ’€) - Code for Kazeev, N., Al-Maeeni, A.R., Romanov, I. et al. Sparse representation for machine learning the properties of.. Apache-2 material-defect - ANI-1 Dataset (πŸ₯‰8 Β· ⭐ 96 Β· πŸ’€) - A data set of 20 million calculated off-equilibrium conformations for organic molecules. MIT - MoleculeNet Leaderboard (πŸ₯‰8 Β· ⭐ 92 Β· πŸ’€) - MIT benchmarking - GEOM (πŸ₯‰7 Β· ⭐ 200 Β· πŸ’€) - GEOM: Energy-annotated molecular conformations. Unlicensed drug-discovery - ANI-1x Datasets (πŸ₯‰6 Β· ⭐ 62 Β· πŸ’€) - The ANI-1ccx and ANI-1x data sets, coupled-cluster and density functional theory properties for organic molecules. MIT - COMP6 Benchmark dataset (πŸ₯‰6 Β· ⭐ 39 Β· πŸ’€) - COMP6 Benchmark dataset for ML potentials. MIT - SciGlass (πŸ₯‰5 Β· ⭐ 12 Β· πŸ’€) - The database contains a vast set of data on the properties of glass materials. MIT - GDB-9-Ex9 and ORNL_AISD-Ex (πŸ₯‰5 Β· ⭐ 6 Β· πŸ’€) - Distributed computing workflow for generation and analysis of large scale molecular datasets obtained running multi-.. Unlicensed - linear-regression-benchmarks (πŸ₯‰5 Β· ⭐ 1 Β· πŸ’€) - Data sets used for linear regression benchmarks. MIT benchmarking single-paper - paper-data-redundancy (πŸ₯‰4 Β· ⭐ 9) - Repo for the paper Exploiting redundancy in large materials datasets for efficient machine learning with less data. BSD-3 small-data single-paper - Visual Graph Datasets (πŸ₯‰4 Β· ⭐ 2) - Datasets for the training of graph neural networks (GNNs) and subsequent visualization of attributional explanations.. MIT XAI rep-learn - OPTIMADE providers dashboard (πŸ₯‰4 Β· ⭐ 1) - A dashboard of known providers. Unlicensed - nep-data (πŸ₯‰2 Β· ⭐ 14 Β· πŸ’€) - Data related to the NEP machine-learned potential of GPUMD. Unlicensed ML-IAP MD transport-phenomena - tmQM_wB97MV Dataset (πŸ₯‰2 Β· ⭐ 6 Β· πŸ’€) - Code for Applying Large Graph Neural Networks to Predict Transition Metal Complex Energies Using the tmQM_wB97MV.. Unlicensed catalysis rep-learn


Data Structures

Back to top

Projects that focus on providing data structures used in atomistic machine learning.

dpdata (πŸ₯‡23 Β· ⭐ 200) - A Python package for manipulating atomistic data of software in computational science. LGPL-3.0 - [GitHub](https://github.com/deepmodeling/dpdata) (πŸ‘¨β€πŸ’» 61 Β· πŸ”€ 130 Β· πŸ“¦ 130 Β· πŸ“‹ 120 - 27% open Β· ⏱️ 20.09.2024):
git clone https://github.com/deepmodeling/dpdata
- [PyPi](https://pypi.org/project/dpdata) (πŸ“₯ 21K / month Β· πŸ“¦ 40 Β· ⏱️ 20.09.2024):
pip install dpdata
- [Conda](https://anaconda.org/deepmodeling/dpdata) (πŸ“₯ 250 Β· ⏱️ 27.09.2023):
conda install -c deepmodeling dpdata
Metatensor (πŸ₯ˆ22 Β· ⭐ 57) - Self-describing sparse tensor data format for atomistic machine learning and beyond. BSD-3 Rust C-lang C++ Python - [GitHub](https://github.com/metatensor/metatensor) (πŸ‘¨β€πŸ’» 26 Β· πŸ”€ 18 Β· πŸ“₯ 37K Β· πŸ“¦ 13 Β· πŸ“‹ 220 - 29% open Β· ⏱️ 19.12.2024):
git clone https://github.com/metatensor/metatensor
mp-pyrho (πŸ₯‰17 Β· ⭐ 37) - Tools for re-griding volumetric quantum chemistry data for machine-learning purposes. Custom ML-DFT - [GitHub](https://github.com/materialsproject/pyrho) (πŸ‘¨β€πŸ’» 10 Β· πŸ”€ 7 Β· πŸ“¦ 26 Β· πŸ“‹ 5 - 40% open Β· ⏱️ 22.10.2024):
git clone https://github.com/materialsproject/pyrho
- [PyPi](https://pypi.org/project/mp-pyrho) (πŸ“₯ 6.6K / month Β· πŸ“¦ 5 Β· ⏱️ 22.10.2024):
pip install mp-pyrho
dlpack (πŸ₯‰15 Β· ⭐ 920) - common in-memory tensor structure. Apache-2 C++ - [GitHub](https://github.com/dmlc/dlpack) (πŸ‘¨β€πŸ’» 24 Β· πŸ”€ 130 Β· πŸ“‹ 72 - 41% open Β· ⏱️ 28.09.2024):
git clone https://github.com/dmlc/dlpack


Density functional theory (ML-DFT)

Back to top

Projects and models that focus on quantities of DFT, such as density functional approximations (ML-DFA), the charge density, density of states, the Hamiltonian, etc.

πŸ”— IKS-PIML - Code and generated data for the paper Inverting the Kohn-Sham equations with physics-informed machine learning.. neural-operator pinn datasets single-paper

JAX-DFT (πŸ₯‡25 Β· ⭐ 35K) - This library provides basic building blocks for constructing DFT calculations as a differentiable program. Apache-2 - [GitHub](https://github.com/google-research/google-research) (πŸ‘¨β€πŸ’» 820 Β· πŸ”€ 7.9K Β· πŸ“‹ 1.8K - 81% open Β· ⏱️ 13.12.2024):
git clone https://github.com/google-research/google-research
MALA (πŸ₯‡20 Β· ⭐ 82) - Materials Learning Algorithms. A framework for machine learning materials properties from first-principles data. BSD-3 - [GitHub](https://github.com/mala-project/mala) (πŸ‘¨β€πŸ’» 44 Β· πŸ”€ 26 Β· πŸ“¦ 2 Β· πŸ“‹ 290 - 10% open Β· ⏱️ 13.12.2024):
git clone https://github.com/mala-project/mala
QHNet (πŸ₯‡13 Β· ⭐ 550) - Artificial Intelligence Research for Science (AIRS). GPL-3.0 rep-learn - [GitHub](https://github.com/divelab/AIRS) (πŸ‘¨β€πŸ’» 30 Β· πŸ”€ 63 Β· πŸ“‹ 20 - 15% open Β· ⏱️ 15.11.2024):
git clone https://github.com/divelab/AIRS
SALTED (πŸ₯ˆ12 Β· ⭐ 32) - Symmetry-Adapted Learning of Three-dimensional Electron Densities. GPL-3.0 - [GitHub](https://github.com/andreagrisafi/SALTED) (πŸ‘¨β€πŸ’» 17 Β· πŸ”€ 4 Β· πŸ“‹ 7 - 28% open Β· ⏱️ 27.09.2024):
git clone https://github.com/andreagrisafi/SALTED
DeepH-pack (πŸ₯ˆ11 Β· ⭐ 250) - Deep neural networks for density functional theory Hamiltonian. LGPL-3.0 Julia - [GitHub](https://github.com/mzjb/DeepH-pack) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 44 Β· πŸ“‹ 55 - 29% open Β· ⏱️ 07.10.2024):
git clone https://github.com/mzjb/DeepH-pack
Grad DFT (πŸ₯ˆ10 Β· ⭐ 82 Β· πŸ’€) - GradDFT is a JAX-based library enabling the differentiable design and experimentation of exchange-correlation.. Apache-2 - [GitHub](https://github.com/XanaduAI/GradDFT) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 8 Β· πŸ“‹ 54 - 20% open Β· ⏱️ 13.02.2024):
git clone https://github.com/XanaduAI/GradDFT
DeePKS-kit (πŸ₯ˆ9 Β· ⭐ 100 Β· πŸ’€) - a package for developing machine learning-based chemically accurate energy and density functional models. LGPL-3.0 - [GitHub](https://github.com/deepmodeling/deepks-kit) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 36 Β· πŸ“‹ 24 - 41% open Β· ⏱️ 13.04.2024):
git clone https://github.com/deepmodeling/deepks-kit
Q-stack (πŸ₯ˆ9 Β· ⭐ 15) - Stack of codes for dedicated pre- and post-processing tasks for Quantum Machine Learning (QML). MIT excited-states general-tool - [GitHub](https://github.com/lcmd-epfl/Q-stack) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 5 Β· πŸ“‹ 29 - 27% open Β· ⏱️ 11.12.2024):
git clone https://github.com/lcmd-epfl/Q-stack
HamGNN (πŸ₯ˆ8 Β· ⭐ 72) - An E(3) equivariant Graph Neural Network for predicting electronic Hamiltonian matrix. GPL-3.0 rep-learn magnetism C-lang - [GitHub](https://github.com/QuantumLab-ZY/HamGNN) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 15 Β· πŸ“‹ 35 - 82% open Β· ⏱️ 27.12.2024):
git clone https://github.com/QuantumLab-ZY/HamGNN
ChargE3Net (πŸ₯‰5 Β· ⭐ 41) - Higher-order equivariant neural networks for charge density prediction in materials. MIT rep-learn - [GitHub](https://github.com/AIforGreatGood/charge3net) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 12 Β· πŸ“‹ 7 - 42% open Β· ⏱️ 30.10.2024):
git clone https://github.com/AIforGreatGood/charge3net
Show 22 hidden projects... - DM21 (πŸ₯‡20 Β· ⭐ 13K Β· πŸ’€) - This package provides a PySCF interface to the DM21 (DeepMind 21) family of exchange-correlation functionals described.. Apache-2 - NeuralXC (πŸ₯ˆ10 Β· ⭐ 34 Β· πŸ’€) - Implementation of a machine learned density functional. BSD-3 - ACEhamiltonians (πŸ₯ˆ10 Β· ⭐ 15 Β· πŸ’€) - Provides tools for constructing, fitting, and predicting self-consistent Hamiltonian and overlap matrices in solid-.. MIT Julia - PROPhet (πŸ₯ˆ9 Β· ⭐ 64 Β· πŸ’€) - PROPhet is a code to integrate machine learning techniques with first-principles quantum chemistry approaches. GPL-3.0 ML-IAP MD single-paper C++ - DeepH-E3 (πŸ₯‰7 Β· ⭐ 83 Β· πŸ’€) - General framework for E(3)-equivariant neural network representation of density functional theory Hamiltonian. MIT magnetism - Mat2Spec (πŸ₯‰7 Β· ⭐ 28 Β· πŸ’€) - Density of States Prediction for Materials Discovery via Contrastive Learning from Probabilistic Embeddings. MIT spectroscopy - Libnxc (πŸ₯‰7 Β· ⭐ 17 Β· πŸ’€) - A library for using machine-learned exchange-correlation functionals for density-functional theory. MPL-2.0 C++ Fortran - DeepDFT (πŸ₯‰6 Β· ⭐ 66 Β· πŸ’€) - Official implementation of DeepDFT model. MIT - charge-density-models (πŸ₯‰6 Β· ⭐ 10 Β· πŸ’€) - Tools to build charge density models using [fairchem](https://github.com/FAIR-Chem/fairchem). MIT rep-learn - KSR-DFT (πŸ₯‰6 Β· ⭐ 4 Β· πŸ’€) - Kohn-Sham regularizer for machine-learned DFT functionals. Apache-2 - xDeepH (πŸ₯‰5 Β· ⭐ 34 Β· πŸ’€) - Extended DeepH (xDeepH) method for magnetic materials. LGPL-3.0 magnetism Julia - ML-DFT (πŸ₯‰5 Β· ⭐ 23 Β· πŸ’€) - A package for density functional approximation using machine learning. MIT - InfGCN for Electron Density Estimation (πŸ₯‰5 Β· ⭐ 12 Β· πŸ’€) - Official implementation of the NeurIPS 23 spotlight paper of InfGCN. MIT rep-learn neural-operator - rho_learn (πŸ₯‰5 Β· ⭐ 4 Β· πŸ’€) - A proof-of-concept workflow for torch-based electron density learning. MIT - DeepCDP (πŸ₯‰4 Β· ⭐ 6 Β· πŸ’€) - DeepCDP: Deep learning Charge Density Prediction. Unlicensed - gprep (πŸ₯‰4 Β· πŸ’€) - Fitting DFTB repulsive potentials with GPR. MIT single-paper - APET (πŸ₯‰3 Β· ⭐ 4 Β· πŸ’€) - Atomic Positional Embedding-based Transformer. GPL-3.0 density-of-states transformer - CSNN (πŸ₯‰3 Β· ⭐ 2 Β· πŸ’€) - Primary codebase of CSNN - Concentric Spherical Neural Network for 3D Representation Learning. BSD-3 - MALADA (πŸ₯‰3 Β· ⭐ 1) - MALA Data Acquisition: Helpful tools to build data for MALA. BSD-3 - A3MD (πŸ₯‰2 Β· ⭐ 8 Β· πŸ’€) - MPNN-like + Analytic Density Model = Accurate electron densities. Unlicensed rep-learn single-paper - MLDensity (πŸ₯‰1 Β· ⭐ 3 Β· πŸ’€) - Linear Jacobi-Legendre expansion of the charge density for machine learning-accelerated electronic structure.. Unlicensed - kdft (πŸ₯‰1 Β· ⭐ 2 Β· πŸ’€) - The Kernel Density Functional (KDF) code allows generating ML based DFT functionals. Unlicensed


Educational Resources

Back to top

Tutorials, guides, cookbooks, recipes, etc.

πŸ”— AI for Science 101 community-resource rep-learn

πŸ”— AL4MS 2023 workshop tutorials active-learning

πŸ”— Quantum Chemistry in the Age of Machine Learning - Book, 2022.

AI4Chemistry course (πŸ₯‡11 Β· ⭐ 160 Β· πŸ’€) - EPFL AI for chemistry course, Spring 2023. https://schwallergroup.github.io/ai4chem_course. MIT chemistry - [GitHub](https://github.com/schwallergroup/ai4chem_course) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 37 Β· πŸ“‹ 4 - 25% open Β· ⏱️ 02.05.2024):
git clone https://github.com/schwallergroup/ai4chem_course
jarvis-tools-notebooks (πŸ₯ˆ9 Β· ⭐ 70) - A Google-Colab Notebook Collection for Materials Design: https://jarvis.nist.gov/. NIST - [GitHub](https://github.com/JARVIS-Materials-Design/jarvis-tools-notebooks) (πŸ‘¨β€πŸ’» 5 Β· πŸ”€ 26 Β· ⏱️ 14.08.2024):
git clone https://github.com/JARVIS-Materials-Design/jarvis-tools-notebooks
DSECOP (πŸ₯ˆ9 Β· ⭐ 44 Β· πŸ’€) - This repository contains data science educational materials developed by DSECOP Fellows. CC0-1.0 - [GitHub](https://github.com/GDS-Education-Community-of-Practice/DSECOP) (πŸ‘¨β€πŸ’» 14 Β· πŸ”€ 26 Β· πŸ“‹ 8 - 12% open Β· ⏱️ 26.06.2024):
git clone https://github.com/GDS-Education-Community-of-Practice/DSECOP
iam-notebooks (πŸ₯ˆ8 Β· ⭐ 26) - Jupyter notebooks for the lectures of the Introduction to Atomistic Modeling. Apache-2 - [GitHub](https://github.com/ceriottm/iam-notebooks) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 5 Β· ⏱️ 09.10.2024):
git clone https://github.com/ceriottm/iam-notebooks
COSMO Software Cookbook (πŸ₯ˆ8 Β· ⭐ 17) - A cookbook with recipes for atomic-scale modeling of materials and molecules. BSD-3 - [GitHub](https://github.com/lab-cosmo/atomistic-cookbook) (πŸ‘¨β€πŸ’» 11 Β· πŸ”€ 1 Β· πŸ“‹ 12 - 8% open Β· ⏱️ 20.12.2024):
git clone https://github.com/lab-cosmo/atomistic-cookbook
MACE-tutorials (πŸ₯‰6 Β· ⭐ 43) - Another set of tutorials for the MACE interatomic potential by one of the authors. MIT ML-IAP rep-learn MD - [GitHub](https://github.com/ilyes319/mace-tutorials) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 11 Β· ⏱️ 16.07.2024):
git clone https://github.com/ilyes319/mace-tutorials
Show 19 hidden projects... - Geometric GNN Dojo (πŸ₯‡12 Β· ⭐ 480 Β· πŸ’€) - New to geometric GNNs: try our practical notebook, prepared for MPhil students at the University of Cambridge. MIT rep-learn - DeepLearningLifeSciences (πŸ₯‡12 Β· ⭐ 360 Β· πŸ’€) - Example code from the book Deep Learning for the Life Sciences. MIT - Deep Learning for Molecules and Materials Book (πŸ₯‡11 Β· ⭐ 630 Β· πŸ’€) - Deep learning for molecules and materials book. Custom - OPTIMADE Tutorial Exercises (πŸ₯ˆ9 Β· ⭐ 15 Β· πŸ’€) - Tutorial exercises for the OPTIMADE API. MIT datasets - RDKit Tutorials (πŸ₯ˆ8 Β· ⭐ 270 Β· πŸ’€) - Tutorials to learn how to work with the RDKit. Custom - BestPractices (πŸ₯ˆ8 Β· ⭐ 180 Β· πŸ’€) - Things that you should (and should not) do in your Materials Informatics research. MIT - MAChINE (πŸ₯‰7 Β· ⭐ 1 Β· πŸ’€) - Client-Server Web App to introduce usage of ML in materials science to beginners. MIT - Applied AI for Materials (πŸ₯‰6 Β· ⭐ 59 Β· πŸ’€) - Course materials for Applied AI for Materials Science and Engineering. Unlicensed - ML for catalysis tutorials (πŸ₯‰6 Β· ⭐ 8 Β· πŸ’€) - A jupyter book repo for tutorial on how to use OCP ML models for catalysis. MIT - AI4Science101 (πŸ₯‰5 Β· ⭐ 86 Β· πŸ’€) - AI for Science. Unlicensed - Machine Learning for Materials Hard and Soft (πŸ₯‰5 Β· ⭐ 35 Β· πŸ’€) - ESI-DCAFM-TACO-VDSP Summer School on Machine Learning for Materials Hard and Soft. Unlicensed - Data Handling, DoE and Statistical Analysis for Material Chemists (πŸ₯‰5 Β· ⭐ 2 Β· πŸ’€) - Notebooks for workshops of DoE course, hosted by the Computational Materials Chemistry group at Uppsala University. GPL-3.0 - ML-in-chemistry-101 (πŸ₯‰4 Β· ⭐ 72 Β· πŸ’€) - The course materials for Machine Learning in Chemistry 101. Unlicensed - chemrev-gpr (πŸ₯‰4 Β· ⭐ 10 Β· πŸ’€) - Notebooks accompanying the paper on GPR in materials and molecules in Chemical Reviews 2020. Unlicensed - PiNN Lab (πŸ₯‰4 Β· ⭐ 3 Β· πŸ’€) - Material for running a lab session on atomic neural networks. GPL-3.0 - AI4ChemMat Hands-On Series (πŸ₯‰4 Β· ⭐ 1 Β· πŸ’€) - Hands-On Series organized by Chemistry and Materials working group at Argonne Nat Lab. MPL-2.0 - MLDensity_tutorial (πŸ₯‰2 Β· ⭐ 9 Β· πŸ’€) - Tutorial files to work with ML for the charge density in molecules and solids. Unlicensed - LAMMPS-style pair potentials with GAP (πŸ₯‰2 Β· ⭐ 4 Β· πŸ’€) - A tutorial on how to create LAMMPS-style pair potentials and use them in combination with GAP potentials to run MD.. Unlicensed ML-IAP MD rep-eng - MALA Tutorial (πŸ₯‰2 Β· ⭐ 2 Β· πŸ’€) - A full MALA hands-on tutorial. Unlicensed


Explainable Artificial Intelligence (XAI)

Back to top

Projects that focus on explainability and model interpretability in atomistic ML.
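Many of the tools below are model-agnostic. For orientation, here is a minimal numpy sketch of permutation importance, one of the simplest model-agnostic explanation techniques. Everything in it is illustrative (synthetic data, a stand-in "black box"); it is not code from any of the packages listed.

```python
import numpy as np

# Synthetic data where feature 0 dominates the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 3.0 * X[:, 0] + 0.1 * X[:, 1]

def model(X):
    # Stand-in "black box": here simply the true generating function.
    return 3.0 * X[:, 0] + 0.1 * X[:, 1]

def permutation_importance(model, X, y, n_repeats=10, seed=1):
    """Score each feature by how much shuffling it degrades the model."""
    rng = np.random.default_rng(seed)
    base = np.mean((model(X) - y) ** 2)
    scores = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        errs = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # destroy feature j only
            errs.append(np.mean((model(Xp) - y) ** 2))
        scores[j] = np.mean(errs) - base          # error increase = importance
    return scores

scores = permutation_importance(model, X, y)
print(scores.argmax())  # feature 0 should matter most
```

The same shuffle-and-remeasure idea underlies many practical explainers; dedicated tools like exmol go further by generating chemically meaningful counterfactual molecules instead of permuting abstract features.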

exmol (πŸ₯‡21 Β· ⭐ 290) - Explainer for black box models that predict molecule properties. MIT - [GitHub](https://github.com/ur-whitelab/exmol) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 42 Β· πŸ“¦ 23 Β· πŸ“‹ 71 - 16% open Β· ⏱️ 22.11.2024):
git clone https://github.com/ur-whitelab/exmol
- [PyPi](https://pypi.org/project/exmol) (πŸ“₯ 1.3K / month Β· πŸ“¦ 1 Β· ⏱️ 22.11.2024):
pip install exmol
MEGAN: Multi Explanation Graph Attention Student (πŸ₯‰5 Β· ⭐ 8) - Minimal implementation of graph attention student model architecture. MIT rep-learn - [GitHub](https://github.com/aimat-lab/graph_attention_student) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 1 Β· πŸ“‹ 3 - 33% open Β· ⏱️ 07.10.2024):
git clone https://github.com/aimat-lab/graph_attention_student
Show 1 hidden project...
- Linear vs blackbox (πŸ₯‰3 Β· ⭐ 2 Β· πŸ’€) - Code and data related to the publication: Interpretable models for extrapolation in scientific machine learning. MIT XAI single-paper rep-eng


Electronic structure methods (ML-ESM)

Back to top

Projects and models that focus on quantities of electronic structure methods, which do not fit into either of the categories ML-WFT or ML-DFT.

Show 5 hidden projects...
- QDF for molecule (πŸ₯‡8 Β· ⭐ 210 Β· πŸ’€) - Quantum deep field: data-driven wave function, electron density generation, and energy prediction and extrapolation.. MIT
- QMLearn (πŸ₯ˆ5 Β· ⭐ 11 Β· πŸ’€) - Quantum Machine Learning by learning one-body reduced density matrices in the AO basis... MIT
- q-pac (πŸ₯ˆ5 Β· ⭐ 4 Β· πŸ’€) - Kernel charge equilibration method. MIT electrostatics
- halex (πŸ₯ˆ5 Β· ⭐ 3 Β· πŸ’€) - Hamiltonian Learning for Excited States https://doi.org/10.48550/arXiv.2311.00844. Unlicensed excited-states
- e3psi (πŸ₯‰3 Β· ⭐ 3 Β· πŸ’€) - Equivariant machine learning library for learning from electronic structures. LGPL-3.0


General Tools

Back to top

General tools for atomistic machine learning.

RDKit (πŸ₯‡36 Β· ⭐ 2.7K) - The official sources for the RDKit library. BSD-3 C++ - [GitHub](https://github.com/rdkit/rdkit) (πŸ‘¨β€πŸ’» 240 Β· πŸ”€ 880 Β· πŸ“₯ 870 Β· πŸ“¦ 3 Β· πŸ“‹ 3.7K - 18% open Β· ⏱️ 25.12.2024):
git clone https://github.com/rdkit/rdkit
- [PyPi](https://pypi.org/project/rdkit) (πŸ“₯ 1.4M / month Β· πŸ“¦ 840 Β· ⏱️ 29.12.2024):
pip install rdkit
- [Conda](https://anaconda.org/rdkit/rdkit) (πŸ“₯ 2.6M Β· ⏱️ 16.06.2023):
conda install -c rdkit rdkit
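Once installed, a quick sanity check using the standard RDKit API (ethanol as the example molecule):

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

# Parse a SMILES string and compute a couple of standard descriptors.
mol = Chem.MolFromSmiles("CCO")            # ethanol
print(mol.GetNumAtoms())                   # heavy-atom count: 3
print(round(Descriptors.MolWt(mol), 2))    # molecular weight, ~46.07 g/mol
```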
DeepChem (πŸ₯‡34 Β· ⭐ 5.6K) - Democratizing Deep-Learning for Drug Discovery, Quantum Chemistry, Materials Science and Biology. MIT - [GitHub](https://github.com/deepchem/deepchem) (πŸ‘¨β€πŸ’» 250 Β· πŸ”€ 1.7K Β· πŸ“¦ 480 Β· πŸ“‹ 1.9K - 34% open Β· ⏱️ 24.12.2024):
git clone https://github.com/deepchem/deepchem
- [PyPi](https://pypi.org/project/deepchem) (πŸ“₯ 51K / month Β· πŸ“¦ 14 Β· ⏱️ 24.12.2024):
pip install deepchem
- [Conda](https://anaconda.org/conda-forge/deepchem) (πŸ“₯ 110K Β· ⏱️ 05.04.2024):
conda install -c conda-forge deepchem
- [Docker Hub](https://hub.docker.com/r/deepchemio/deepchem) (πŸ“₯ 8K Β· ⭐ 5 Β· ⏱️ 24.12.2024):
docker pull deepchemio/deepchem
Matminer (πŸ₯‡28 Β· ⭐ 490) - Data mining for materials science. Custom - [GitHub](https://github.com/hackingmaterials/matminer) (πŸ‘¨β€πŸ’» 56 Β· πŸ”€ 190 Β· πŸ“¦ 350 Β· πŸ“‹ 230 - 13% open Β· ⏱️ 11.10.2024):
git clone https://github.com/hackingmaterials/matminer
- [PyPi](https://pypi.org/project/matminer) (πŸ“₯ 15K / month Β· πŸ“¦ 60 Β· ⏱️ 06.10.2024):
pip install matminer
- [Conda](https://anaconda.org/conda-forge/matminer) (πŸ“₯ 78K Β· ⏱️ 21.12.2024):
conda install -c conda-forge matminer
QUIP (πŸ₯ˆ24 Β· ⭐ 360) - libAtoms/QUIP molecular dynamics framework: https://libatoms.github.io. GPL-2.0 MD ML-IAP rep-eng Fortran - [GitHub](https://github.com/libAtoms/QUIP) (πŸ‘¨β€πŸ’» 85 Β· πŸ”€ 120 Β· πŸ“₯ 730 Β· πŸ“¦ 45 Β· πŸ“‹ 470 - 22% open Β· ⏱️ 27.09.2024):
git clone https://github.com/libAtoms/QUIP
- [PyPi](https://pypi.org/project/quippy-ase) (πŸ“₯ 2.6K / month Β· πŸ“¦ 4 Β· ⏱️ 15.01.2023):
pip install quippy-ase
- [Docker Hub](https://hub.docker.com/r/libatomsquip/quip) (πŸ“₯ 10K Β· ⭐ 4 Β· ⏱️ 24.04.2023):
docker pull libatomsquip/quip
JARVIS-Tools (πŸ₯ˆ23 Β· ⭐ 320) - JARVIS-Tools: an open-source software package for data-driven atomistic materials design. Publications:.. Custom - [GitHub](https://github.com/usnistgov/jarvis) (πŸ‘¨β€πŸ’» 15 Β· πŸ”€ 120 Β· πŸ“¦ 110 Β· πŸ“‹ 92 - 51% open Β· ⏱️ 20.11.2024):
git clone https://github.com/usnistgov/jarvis
- [PyPi](https://pypi.org/project/jarvis-tools) (πŸ“₯ 19K / month Β· πŸ“¦ 31 Β· ⏱️ 20.11.2024):
pip install jarvis-tools
- [Conda](https://anaconda.org/conda-forge/jarvis-tools) (πŸ“₯ 87K Β· ⏱️ 20.11.2024):
conda install -c conda-forge jarvis-tools
MAML (πŸ₯ˆ21 Β· ⭐ 380) - Python for Materials Machine Learning, Materials Descriptors, Machine Learning Force Fields, Deep Learning, etc. BSD-3 - [GitHub](https://github.com/materialsvirtuallab/maml) (πŸ‘¨β€πŸ’» 33 Β· πŸ”€ 79 Β· πŸ“¦ 12 Β· πŸ“‹ 71 - 12% open Β· ⏱️ 06.11.2024):
git clone https://github.com/materialsvirtuallab/maml
- [PyPi](https://pypi.org/project/maml) (πŸ“₯ 460 / month Β· πŸ“¦ 2 Β· ⏱️ 13.06.2024):
pip install maml
MAST-ML (πŸ₯ˆ19 Β· ⭐ 110) - MAterials Simulation Toolkit for Machine Learning (MAST-ML). MIT - [GitHub](https://github.com/uw-cmg/MAST-ML) (πŸ‘¨β€πŸ’» 19 Β· πŸ”€ 61 Β· πŸ“₯ 140 Β· πŸ“¦ 45 Β· πŸ“‹ 220 - 14% open Β· ⏱️ 09.10.2024):
git clone https://github.com/uw-cmg/MAST-ML
QML (πŸ₯ˆ18 Β· ⭐ 200) - QML: Quantum Machine Learning. MIT - [GitHub](https://github.com/qmlcode/qml) (πŸ‘¨β€πŸ’» 10 Β· πŸ”€ 84 Β· πŸ“‹ 59 - 64% open Β· ⏱️ 08.12.2024):
git clone https://github.com/qmlcode/qml
- [PyPi](https://pypi.org/project/qml) (πŸ“₯ 390 / month Β· ⏱️ 13.08.2018):
pip install qml
Scikit-Matter (πŸ₯ˆ17 Β· ⭐ 77) - A collection of scikit-learn compatible utilities that implement methods born out of the materials science and.. BSD-3 scikit-learn - [GitHub](https://github.com/scikit-learn-contrib/scikit-matter) (πŸ‘¨β€πŸ’» 15 Β· πŸ”€ 19 Β· πŸ“¦ 11 Β· πŸ“‹ 70 - 20% open Β· ⏱️ 09.10.2024):
git clone https://github.com/scikit-learn-contrib/scikit-matter
- [PyPi](https://pypi.org/project/skmatter) (πŸ“₯ 1.8K / month Β· ⏱️ 24.08.2023):
pip install skmatter
- [Conda](https://anaconda.org/conda-forge/skmatter) (πŸ“₯ 2.6K Β· ⏱️ 24.08.2023):
conda install -c conda-forge skmatter
MLatom (πŸ₯‰16 Β· ⭐ 72) - AI-enhanced computational chemistry. MIT UIP ML-IAP MD ML-DFT ML-ESM transfer-learning active-learning spectroscopy structure-optimization - [GitHub](https://github.com/dralgroup/mlatom) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 11 Β· πŸ“‹ 5 - 20% open Β· ⏱️ 18.12.2024):
git clone https://github.com/dralgroup/mlatom
- [PyPi](https://pypi.org/project/mlatom) (πŸ“₯ 3.4K / month Β· ⏱️ 18.12.2024):
pip install mlatom
XenonPy (πŸ₯‰15 Β· ⭐ 140 Β· πŸ’€) - XenonPy is a Python Software for Materials Informatics. BSD-3 - [GitHub](https://github.com/yoshida-lab/XenonPy) (πŸ‘¨β€πŸ’» 10 Β· πŸ”€ 61 Β· πŸ“₯ 1.5K Β· πŸ“‹ 87 - 24% open Β· ⏱️ 21.04.2024):
git clone https://github.com/yoshida-lab/XenonPy
- [PyPi](https://pypi.org/project/xenonpy) (πŸ“₯ 890 / month Β· πŸ“¦ 1 Β· ⏱️ 31.10.2022):
pip install xenonpy
Artificial Intelligence for Science (AIRS) (πŸ₯‰13 Β· ⭐ 550) - Artificial Intelligence Research for Science (AIRS). GPL-3.0 rep-learn generative ML-IAP MD ML-DFT ML-WFT biomolecules - [GitHub](https://github.com/divelab/AIRS) (πŸ‘¨β€πŸ’» 30 Β· πŸ”€ 63 Β· πŸ“‹ 20 - 15% open Β· ⏱️ 15.11.2024):
git clone https://github.com/divelab/AIRS
Show 10 hidden projects...
- Automatminer (πŸ₯‰15 Β· ⭐ 140 Β· πŸ’€) - An automatic engine for predicting materials properties. Custom autoML
- AMPtorch (πŸ₯‰11 Β· ⭐ 60 Β· πŸ’€) - AMPtorch: Atomistic Machine Learning Package (AMP) - PyTorch. GPL-3.0
- OpenChem (πŸ₯‰10 Β· ⭐ 680 Β· πŸ’€) - OpenChem: Deep Learning toolkit for Computational Chemistry and Drug Design Research. MIT
- JAXChem (πŸ₯‰7 Β· ⭐ 79 Β· πŸ’€) - JAXChem is a JAX-based deep learning library for complex and versatile chemical modeling. MIT
- uncertainty_benchmarking (πŸ₯‰7 Β· ⭐ 41 Β· πŸ’€) - Various code/notebooks to benchmark different ways we could estimate uncertainty in ML predictions. Unlicensed benchmarking probabilistic
- torchchem (πŸ₯‰7 Β· ⭐ 35 Β· πŸ’€) - An experimental repo for experimenting with PyTorch models. MIT
- Equisolve (πŸ₯‰6 Β· ⭐ 5 Β· πŸ’€) - A ML toolkit package utilizing the metatensor data format to build models for the prediction of equivariant properties.. BSD-3 ML-IAP
- ACEatoms (πŸ₯‰4 Β· ⭐ 2 Β· πŸ’€) - Generic code for modelling atomic properties using ACE. Custom Julia
- Magpie (πŸ₯‰3) - Materials Agnostic Platform for Informatics and Exploration (Magpie). MIT Java
- quantum-structure-ml (πŸ₯‰2 Β· ⭐ 2 Β· πŸ’€) - Multi-class classification model for predicting the magnetic order of magnetic structures and a binary classification.. Unlicensed magnetism benchmarking


Generative Models

Back to top

Projects that implement generative models for atomistic ML.
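For orientation, the simplest possible "generative model of molecules" is a character-level bigram model over SMILES strings: count which character follows which, then sample. The sketch below is purely illustrative (a tiny hand-written training set, not code from any listed package); the real projects use VAEs, diffusion models, or autoregressive graph networks.

```python
import random
from collections import defaultdict

# Tiny illustrative training set of SMILES strings.
smiles = ["CCO", "CCN", "CCC", "CCCl", "CC=O"]

# Count bigram transitions, with ^ and $ as start/end tokens.
counts = defaultdict(lambda: defaultdict(int))
for s in smiles:
    chars = ["^"] + list(s) + ["$"]
    for a, b in zip(chars, chars[1:]):
        counts[a][b] += 1

def sample(max_len=10, seed=0):
    """Sample a string by walking the bigram transition table."""
    rng = random.Random(seed)
    out, cur = [], "^"
    while len(out) < max_len:
        nxt = rng.choices(list(counts[cur]),
                          weights=list(counts[cur].values()))[0]
        if nxt == "$":          # end token: stop generation
            break
        out.append(nxt)
        cur = nxt
    return "".join(out)

print(sample())
```

Samples always start with "C" and only use characters seen in training; richer models below replace the bigram table with learned neural distributions over atoms, bonds, or 3D coordinates.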

GT4SD (πŸ₯‡18 Β· ⭐ 340 Β· πŸ“ˆ) - GT4SD, an open-source library to accelerate hypothesis generation in the scientific discovery process. MIT pretrained drug-discovery rep-learn - [GitHub](https://github.com/GT4SD/gt4sd-core) (πŸ‘¨β€πŸ’» 20 Β· πŸ”€ 72 Β· πŸ“‹ 120 - 12% open Β· ⏱️ 12.09.2024):
git clone https://github.com/GT4SD/gt4sd-core
- [PyPi](https://pypi.org/project/gt4sd) (πŸ“₯ 2.4K / month Β· ⏱️ 12.09.2024):
pip install gt4sd
MoLeR (πŸ₯‡15 Β· ⭐ 280 Β· πŸ’€) - Implementation of MoLeR: a generative model of molecular graphs which supports scaffold-constrained generation. MIT - [GitHub](https://github.com/microsoft/molecule-generation) (πŸ‘¨β€πŸ’» 5 Β· πŸ”€ 41 Β· πŸ“‹ 40 - 22% open Β· ⏱️ 03.01.2024):
git clone https://github.com/microsoft/molecule-generation
- [PyPi](https://pypi.org/project/molecule-generation) (πŸ“₯ 240 / month Β· πŸ“¦ 1 Β· ⏱️ 05.01.2024):
pip install molecule-generation
PMTransformer (πŸ₯ˆ14 Β· ⭐ 89 Β· πŸ’€) - Universal Transfer Learning in Porous Materials, including MOFs. MIT transfer-learning pretrained transformer - [GitHub](https://github.com/hspark1212/MOFTransformer) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 13 Β· πŸ“¦ 8 Β· ⏱️ 20.06.2024):
git clone https://github.com/hspark1212/MOFTransformer
- [PyPi](https://pypi.org/project/moftransformer) (πŸ“₯ 570 / month Β· πŸ“¦ 1 Β· ⏱️ 20.06.2024):
pip install moftransformer
SiMGen (πŸ₯ˆ13 Β· ⭐ 17) - Zero Shot Molecular Generation via Similarity Kernels. MIT viz - [GitHub](https://github.com/RokasEl/simgen) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 2 Β· πŸ“¦ 2 Β· πŸ“‹ 4 - 25% open Β· ⏱️ 13.12.2024):
git clone https://github.com/RokasEl/simgen
- [PyPi](https://pypi.org/project/simgen) (πŸ“₯ 200 / month Β· ⏱️ 13.12.2024):
pip install simgen
SchNetPack G-SchNet (πŸ₯ˆ12 Β· ⭐ 52) - G-SchNet extension for SchNetPack. MIT - [GitHub](https://github.com/atomistic-machine-learning/schnetpack-gschnet) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 8 Β· πŸ“‹ 16 - 6% open Β· ⏱️ 07.11.2024):
git clone https://github.com/atomistic-machine-learning/schnetpack-gschnet
COATI (πŸ₯‰5 Β· ⭐ 100 Β· πŸ’€) - COATI: multi-modal contrastive pre-training for representing and traversing chemical space. Apache-2 drug-discovery multimodal pretrained rep-learn - [GitHub](https://github.com/terraytherapeutics/COATI) (πŸ‘¨β€πŸ’» 5 Β· πŸ”€ 6 Β· πŸ“‹ 3 - 33% open Β· ⏱️ 23.03.2024):
git clone https://github.com/terraytherapeutics/COATI
Show 8 hidden projects...
- synspace (πŸ₯ˆ12 Β· ⭐ 36 Β· πŸ’€) - Synthesis generative model. MIT
- EDM (πŸ₯‰9 Β· ⭐ 460 Β· πŸ’€) - E(3) Equivariant Diffusion Model for Molecule Generation in 3D. MIT
- G-SchNet (πŸ₯‰8 Β· ⭐ 130 Β· πŸ’€) - G-SchNet - a generative model for 3d molecular structures. MIT
- bVAE-IM (πŸ₯‰8 Β· ⭐ 11 Β· πŸ’€) - Implementation of Chemical Design with GPU-based Ising Machine. MIT QML single-paper
- cG-SchNet (πŸ₯‰7 Β· ⭐ 54 Β· πŸ’€) - cG-SchNet - a conditional generative neural network for 3d molecular structures. MIT
- rxngenerator (πŸ₯‰6 Β· ⭐ 12 Β· πŸ’€) - A generative model for molecular generation via multi-step chemical reactions. MIT
- MolSLEPA (πŸ₯‰5 Β· ⭐ 5 Β· πŸ’€) - Interpretable Fragment-based Molecule Design with Self-learning Entropic Population Annealing. MIT XAI
- Mapping out phase diagrams with generative classifiers (πŸ₯‰4 Β· ⭐ 7 Β· πŸ’€) - Repository for our "Mapping out phase diagrams with generative models" paper. MIT phase-transition


Interatomic Potentials (ML-IAP)

Back to top

Machine learning interatomic potentials (aka ML-IAP, MLIAP, MLIP, MLP) and force fields (ML-FF) for molecular dynamics.
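The packages below differ widely in architecture, but most share the same skeleton: encode each atom's local environment into a symmetry-invariant descriptor, map descriptors to per-atom energies, and sum. A deliberately minimal numpy sketch of that skeleton follows; the toy Gaussian radial features with a smooth cutoff are loosely Behler-Parrinello-flavored, the linear readout weights are hand-picked, and nothing here comes from any listed package (real codes learn the readout from reference DFT data).

```python
import numpy as np

def radial_descriptor(positions, cutoff=4.0, centers=(1.0, 2.0, 3.0), width=0.5):
    """Toy atom-centered descriptor: Gaussians on neighbor distances."""
    centers = np.asarray(centers)
    n = len(positions)
    feats = np.zeros((n, len(centers)))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = np.linalg.norm(positions[i] - positions[j])
            if r < cutoff:
                fc = 0.5 * (np.cos(np.pi * r / cutoff) + 1.0)  # smooth cutoff
                feats[i] += np.exp(-((r - centers) ** 2) / width**2) * fc
    return feats

def total_energy(positions, weights):
    """Per-atom energies from a fixed linear readout, summed over atoms."""
    per_atom = radial_descriptor(positions) @ weights
    return per_atom.sum()

# Toy dimer: the energy depends only on the interatomic distance.
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
w = np.array([-1.0, -0.5, -0.1])
print(float(total_energy(pos, w)))
```

By construction the energy is invariant to translating the system or permuting identical atoms, which is exactly the symmetry structure the descriptor layer of a real ML-IAP has to guarantee before any learning happens.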

DeePMD-kit (πŸ₯‡28 Β· ⭐ 1.5K) - A deep learning package for many-body potential energy representation and molecular dynamics. LGPL-3.0 C++ - [GitHub](https://github.com/deepmodeling/deepmd-kit) (πŸ‘¨β€πŸ’» 73 Β· πŸ”€ 520 Β· πŸ“₯ 46K Β· πŸ“¦ 22 Β· πŸ“‹ 870 - 10% open Β· ⏱️ 23.12.2024):
git clone https://github.com/deepmodeling/deepmd-kit
- [PyPi](https://pypi.org/project/deepmd-kit) (πŸ“₯ 6K / month Β· πŸ“¦ 4 Β· ⏱️ 23.12.2024):
pip install deepmd-kit
- [Conda](https://anaconda.org/deepmodeling/deepmd-kit) (πŸ“₯ 1.7K Β· ⏱️ 06.04.2024):
conda install -c deepmodeling deepmd-kit
- [Docker Hub](https://hub.docker.com/r/deepmodeling/deepmd-kit) (πŸ“₯ 3.3K Β· ⭐ 1 Β· ⏱️ 25.11.2024):
docker pull deepmodeling/deepmd-kit
fairchem (πŸ₯‡25 Β· ⭐ 940) - FAIR Chemistry's library of machine learning methods for chemistry. Formerly known as Open Catalyst Project. MIT pretrained UIP rep-learn catalysis - [GitHub](https://github.com/FAIR-Chem/fairchem) (πŸ‘¨β€πŸ’» 43 Β· πŸ”€ 260 Β· πŸ“‹ 250 - 11% open Β· ⏱️ 20.12.2024):
git clone https://github.com/FAIR-Chem/fairchem
- [PyPi](https://pypi.org/project/fairchem-core) (πŸ“₯ 4.8K / month Β· πŸ“¦ 3 Β· ⏱️ 19.12.2024):
pip install fairchem-core
DP-GEN (πŸ₯‡23 Β· ⭐ 320) - The deep potential generator to generate a deep-learning based model of interatomic potential energy and force field. LGPL-3.0 workflows - [GitHub](https://github.com/deepmodeling/dpgen) (πŸ‘¨β€πŸ’» 69 Β· πŸ”€ 180 Β· πŸ“₯ 1.9K Β· πŸ“¦ 7 Β· πŸ“‹ 310 - 14% open Β· ⏱️ 23.11.2024):
git clone https://github.com/deepmodeling/dpgen
- [PyPi](https://pypi.org/project/dpgen) (πŸ“₯ 870 / month Β· πŸ“¦ 2 Β· ⏱️ 23.11.2024):
pip install dpgen
- [Conda](https://anaconda.org/deepmodeling/dpgen) (πŸ“₯ 220 Β· ⏱️ 16.06.2023):
conda install -c deepmodeling dpgen
NequIP (πŸ₯‡22 Β· ⭐ 660) - NequIP is a code for building E(3)-equivariant interatomic potentials. MIT - [GitHub](https://github.com/mir-group/nequip) (πŸ‘¨β€πŸ’» 12 Β· πŸ”€ 140 Β· πŸ“¦ 33 Β· πŸ“‹ 98 - 25% open Β· ⏱️ 14.11.2024):
git clone https://github.com/mir-group/nequip
- [PyPi](https://pypi.org/project/nequip) (πŸ“₯ 1.6K / month Β· πŸ“¦ 1 Β· ⏱️ 09.07.2024):
pip install nequip
- [Conda](https://anaconda.org/conda-forge/nequip) (πŸ“₯ 7.1K Β· ⏱️ 31.12.2024):
conda install -c conda-forge nequip
MACE (πŸ₯‡22 Β· ⭐ 580) - MACE - Fast and accurate machine learning interatomic potentials with higher order equivariant message passing. MIT - [GitHub](https://github.com/ACEsuit/mace) (πŸ‘¨β€πŸ’» 47 Β· πŸ”€ 210 Β· πŸ“‹ 320 - 21% open Β· ⏱️ 20.12.2024):
git clone https://github.com/ACEsuit/mace
GPUMD (πŸ₯‡22 Β· ⭐ 500) - GPUMD is a highly efficient general-purpose molecular dynamics (MD) package and enables machine-learned potentials.. GPL-3.0 MD C++ electrostatics - [GitHub](https://github.com/brucefan1983/GPUMD) (πŸ‘¨β€πŸ’» 42 Β· πŸ”€ 120 Β· πŸ“‹ 190 - 11% open Β· ⏱️ 02.01.2025):
git clone https://github.com/brucefan1983/GPUMD
TorchMD-NET (πŸ₯ˆ21 Β· ⭐ 350) - Training neural network potentials. MIT MD rep-learn transformer pretrained - [GitHub](https://github.com/torchmd/torchmd-net) (πŸ‘¨β€πŸ’» 16 Β· πŸ”€ 75 Β· πŸ“‹ 130 - 34% open Β· ⏱️ 03.12.2024):
git clone https://github.com/torchmd/torchmd-net
- [Conda](https://anaconda.org/conda-forge/torchmd-net) (πŸ“₯ 270K Β· ⏱️ 03.12.2024):
conda install -c conda-forge torchmd-net
apax (πŸ₯ˆ19 Β· ⭐ 19) - A flexible and performant framework for training machine learning potentials. MIT - [GitHub](https://github.com/apax-hub/apax) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 3 Β· πŸ“¦ 3 Β· πŸ“‹ 140 - 13% open Β· ⏱️ 17.12.2024):
git clone https://github.com/apax-hub/apax
- [PyPi](https://pypi.org/project/apax) (πŸ“₯ 600 / month Β· ⏱️ 03.12.2024):
pip install apax
Neural Force Field (πŸ₯ˆ16 Β· ⭐ 250) - Neural Network Force Field based on PyTorch. MIT pretrained - [GitHub](https://github.com/learningmatter-mit/NeuralForceField) (πŸ‘¨β€πŸ’» 42 Β· πŸ”€ 51 Β· πŸ“‹ 21 - 14% open Β· ⏱️ 06.12.2024):
git clone https://github.com/learningmatter-mit/NeuralForceField
n2p2 (πŸ₯ˆ16 Β· ⭐ 230) - n2p2 - A Neural Network Potential Package. GPL-3.0 C++ - [GitHub](https://github.com/CompPhysVienna/n2p2) (πŸ‘¨β€πŸ’» 11 Β· πŸ”€ 78 Β· πŸ“‹ 150 - 44% open Β· ⏱️ 24.11.2024):
git clone https://github.com/CompPhysVienna/n2p2
NNPOps (πŸ₯ˆ15 Β· ⭐ 88) - High-performance operations for neural network potentials. MIT MD C++ - [GitHub](https://github.com/openmm/NNPOps) (πŸ‘¨β€πŸ’» 9 Β· πŸ”€ 18 Β· πŸ“‹ 57 - 38% open Β· ⏱️ 10.07.2024):
git clone https://github.com/openmm/NNPOps
- [Conda](https://anaconda.org/conda-forge/nnpops) (πŸ“₯ 310K Β· ⏱️ 14.11.2024):
conda install -c conda-forge nnpops
PyXtalFF (πŸ₯ˆ15 Β· ⭐ 87 Β· πŸ’€) - Machine Learning Interatomic Potential Predictions. MIT - [GitHub](https://github.com/MaterSim/PyXtal_FF) (πŸ‘¨β€πŸ’» 9 Β· πŸ”€ 23 Β· πŸ“‹ 63 - 19% open Β· ⏱️ 07.01.2024):
git clone https://github.com/MaterSim/PyXtal_FF
- [PyPi](https://pypi.org/project/pyxtal_ff) (πŸ“₯ 210 / month Β· ⏱️ 21.12.2022):
pip install pyxtal_ff
KLIFF (πŸ₯ˆ15 Β· ⭐ 34) - KIM-based Learning-Integrated Fitting Framework for interatomic potentials. LGPL-2.1 probabilistic workflows - [GitHub](https://github.com/openkim/kliff) (πŸ‘¨β€πŸ’» 9 Β· πŸ”€ 19 Β· πŸ“¦ 4 Β· πŸ“‹ 42 - 54% open Β· ⏱️ 08.10.2024):
git clone https://github.com/openkim/kliff
- [PyPi](https://pypi.org/project/kliff) (πŸ“₯ 270 / month Β· ⏱️ 17.12.2023):
pip install kliff
- [Conda](https://anaconda.org/conda-forge/kliff) (πŸ“₯ 130K Β· ⏱️ 10.09.2024):
conda install -c conda-forge kliff
Ultra-Fast Force Fields (UF3) (πŸ₯ˆ14 Β· ⭐ 62) - UF3: a python library for generating ultra-fast interatomic potentials. Apache-2 - [GitHub](https://github.com/uf3/uf3) (πŸ‘¨β€πŸ’» 10 Β· πŸ”€ 22 Β· πŸ“¦ 2 Β· πŸ“‹ 50 - 38% open Β· ⏱️ 04.10.2024):
git clone https://github.com/uf3/uf3
- [PyPi](https://pypi.org/project/uf3) (πŸ“₯ 57 / month Β· ⏱️ 27.10.2023):
pip install uf3
MLIPX - Machine-Learned Interatomic Potential eXploration (πŸ₯ˆ14 Β· ⭐ 62 Β· 🐣) - Machine-Learned Interatomic Potential eXploration (mlipx) is designed at BASF for evaluating machine-learned.. MIT benchmarking viz workflows - [GitHub](https://github.com/basf/mlipx) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 4 Β· πŸ“‹ 4 - 50% open Β· ⏱️ 12.12.2024):
git clone https://github.com/basf/mlipx
- [PyPi](https://pypi.org/project/mlipx) (πŸ“₯ 860 / month Β· ⏱️ 12.12.2024):
pip install mlipx
wfl (πŸ₯ˆ14 Β· ⭐ 36) - Workflow is a Python toolkit for building interatomic potential creation and atomistic simulation workflows. GPL-2.0 workflows HTC - [GitHub](https://github.com/libAtoms/workflow) (πŸ‘¨β€πŸ’» 19 Β· πŸ”€ 19 Β· πŸ“¦ 2 Β· πŸ“‹ 160 - 41% open Β· ⏱️ 04.12.2024):
git clone https://github.com/libAtoms/workflow
PiNN (πŸ₯ˆ13 Β· ⭐ 110) - A Python library for building atomic neural networks. BSD-3 - [GitHub](https://github.com/Teoroo-CMC/PiNN) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 33 Β· πŸ“‹ 7 - 14% open Β· ⏱️ 20.12.2024):
git clone https://github.com/Teoroo-CMC/PiNN
- [Docker Hub](https://hub.docker.com/r/teoroo/pinn) (πŸ“₯ 380 Β· ⏱️ 20.12.2024):
docker pull teoroo/pinn
So3krates (MLFF) (πŸ₯ˆ13 Β· ⭐ 100) - Build neural networks for machine learning force fields with JAX. MIT - [GitHub](https://github.com/thorben-frank/mlff) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 22 Β· πŸ“‹ 10 - 40% open Β· ⏱️ 23.08.2024):
git clone https://github.com/thorben-frank/mlff
ANI-1 (πŸ₯ˆ12 Β· ⭐ 220 Β· πŸ’€) - ANI-1 neural net potential with python interface (ASE). MIT - [GitHub](https://github.com/isayev/ASE_ANI) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 54 Β· πŸ“‹ 37 - 43% open Β· ⏱️ 11.03.2024):
git clone https://github.com/isayev/ASE_ANI
DMFF (πŸ₯ˆ12 Β· ⭐ 160 Β· πŸ’€) - DMFF (Differentiable Molecular Force Field) is a Jax-based python package that provides a full differentiable.. LGPL-3.0 - [GitHub](https://github.com/deepmodeling/DMFF) (πŸ‘¨β€πŸ’» 14 Β· πŸ”€ 45 Β· πŸ“‹ 27 - 40% open Β· ⏱️ 12.01.2024):
git clone https://github.com/deepmodeling/DMFF
Pacemaker (πŸ₯ˆ12 Β· ⭐ 73) - Python package for fitting atomic cluster expansion (ACE) potentials. Custom - [GitHub](https://github.com/ICAMS/python-ace) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 19 Β· πŸ“‹ 58 - 34% open Β· ⏱️ 20.11.2024):
git clone https://github.com/ICAMS/python-ace
- [PyPi](https://pypi.org/project/python-ace) (πŸ“₯ 15 / month Β· ⏱️ 24.10.2022):
pip install python-ace
CCS_fit (πŸ₯ˆ12 Β· ⭐ 8 Β· πŸ’€) - Curvature Constrained Splines. GPL-3.0 - [GitHub](https://github.com/Teoroo-CMC/CCS) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 11 Β· πŸ“₯ 750 Β· πŸ“‹ 14 - 57% open Β· ⏱️ 16.02.2024):
git clone https://github.com/Teoroo-CMC/CCS
- [PyPi](https://pypi.org/project/ccs_fit) (πŸ“₯ 2.5K / month Β· ⏱️ 16.02.2024):
pip install ccs_fit
PyNEP (πŸ₯ˆ11 Β· ⭐ 50) - A python interface of the machine learning potential NEP used in GPUMD. MIT - [GitHub](https://github.com/bigd4/PyNEP) (πŸ‘¨β€πŸ’» 9 Β· πŸ”€ 16 Β· πŸ“‹ 11 - 36% open Β· ⏱️ 15.12.2024):
git clone https://github.com/bigd4/PyNEP
calorine (πŸ₯ˆ11 Β· ⭐ 14) - A Python package for constructing and sampling neuroevolution potential models. https://doi.org/10.21105/joss.06264. Custom - [PyPi](https://pypi.org/project/calorine) (πŸ“₯ 1.4K / month Β· πŸ“¦ 4 Β· ⏱️ 25.10.2024):
pip install calorine
- [GitLab](https://gitlab.com/materials-modeling/calorine) (πŸ”€ 4 Β· πŸ“‹ 91 - 5% open Β· ⏱️ 25.10.2024):
git clone https://gitlab.com/materials-modeling/calorine
Allegro (πŸ₯‰10 Β· ⭐ 370) - Allegro is an open-source code for building highly scalable and accurate equivariant deep learning interatomic.. MIT - [GitHub](https://github.com/mir-group/allegro) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 46 Β· πŸ“‹ 40 - 52% open Β· ⏱️ 14.11.2024):
git clone https://github.com/mir-group/allegro
ACE.jl (πŸ₯‰10 Β· ⭐ 65) - Parameterisation of Equivariant Properties of Particle Systems. Custom Julia - [GitHub](https://github.com/ACEsuit/ACE.jl) (πŸ‘¨β€πŸ’» 12 Β· πŸ”€ 15 Β· πŸ“‹ 82 - 29% open Β· ⏱️ 17.12.2024):
git clone https://github.com/ACEsuit/ACE.jl
Asparagus (πŸ₯‰10 Β· ⭐ 9 Β· 🐣) - Program Package for Sampling, Training and Applying ML-based Potential models https://doi.org/10.48550/arXiv.2407.15175. MIT workflows sampling MD - [GitHub](https://github.com/MMunibas/Asparagus) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 3 Β· ⏱️ 13.12.2024):
git clone https://github.com/MMunibas/Asparagus
tinker-hp (πŸ₯‰9 Β· ⭐ 82 Β· πŸ“‰) - Tinker-HP: High-Performance Massively Parallel Evolution of Tinker on CPUs & GPUs. Custom - [GitHub](https://github.com/TinkerTools/tinker-hp) (πŸ‘¨β€πŸ’» 12 Β· πŸ”€ 22 Β· πŸ“‹ 22 - 22% open Β· ⏱️ 26.10.2024):
git clone https://github.com/TinkerTools/tinker-hp
ACE1.jl (πŸ₯‰9 Β· ⭐ 21) - Atomic Cluster Expansion for Modelling Invariant Atomic Properties. Custom Julia - [GitHub](https://github.com/ACEsuit/ACE1.jl) (πŸ‘¨β€πŸ’» 9 Β· πŸ”€ 7 Β· πŸ“‹ 46 - 47% open Β· ⏱️ 11.09.2024):
git clone https://github.com/ACEsuit/ACE1.jl
Point Edge Transformer (PET) (πŸ₯‰9 Β· ⭐ 19) - Point Edge Transformer. MIT rep-learn transformer - [GitHub](https://github.com/spozdn/pet) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 5 Β· ⏱️ 02.07.2024):
git clone https://github.com/spozdn/pet
ACEfit (πŸ₯‰9 Β· ⭐ 7) - MIT Julia - [GitHub](https://github.com/ACEsuit/ACEfit.jl) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 7 Β· πŸ“‹ 57 - 38% open Β· ⏱️ 14.09.2024):
git clone https://github.com/ACEsuit/ACEfit.jl
GAP (πŸ₯‰8 Β· ⭐ 40) - Gaussian Approximation Potential (GAP). Custom - [GitHub](https://github.com/libAtoms/GAP) (πŸ‘¨β€πŸ’» 13 Β· πŸ”€ 20 Β· ⏱️ 17.08.2024):
git clone https://github.com/libAtoms/GAP
ALF (πŸ₯‰8 Β· ⭐ 31) - A framework for performing active learning for training machine-learned interatomic potentials. Custom active-learning - [GitHub](https://github.com/lanl/ALF) (πŸ‘¨β€πŸ’» 5 Β· πŸ”€ 12 Β· ⏱️ 04.11.2024):
git clone https://github.com/lanl/alf
TurboGAP (πŸ₯‰8 Β· ⭐ 16) - The TurboGAP code. Custom Fortran - [GitHub](https://github.com/mcaroba/turbogap) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 10 Β· πŸ“‹ 11 - 72% open Β· ⏱️ 17.12.2024):
git clone https://github.com/mcaroba/turbogap
MLXDM (πŸ₯‰6 Β· ⭐ 7) - A Neural Network Potential with Rigorous Treatment of Long-Range Dispersion https://doi.org/10.1039/D2DD00150K. MIT long-range - [GitHub](https://github.com/RowleyGroup/MLXDM) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 2 Β· ⏱️ 18.12.2024):
git clone https://github.com/RowleyGroup/MLXDM
TensorPotential (πŸ₯‰5 Β· ⭐ 10) - Tensorpotential is a TensorFlow based tool for development, fitting ML interatomic potentials from electronic.. Custom - [GitHub](https://github.com/ICAMS/TensorPotential) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 4 Β· ⏱️ 12.09.2024):
git clone https://github.com/ICAMS/TensorPotential
Show 35 hidden projects...
- TorchANI (πŸ₯‡24 Β· ⭐ 480 Β· πŸ’€) - Accurate Neural Network Potential on PyTorch. MIT
- MEGNet (πŸ₯‡23 Β· ⭐ 510 Β· πŸ’€) - Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals. BSD-3 multifidelity
- sGDML (πŸ₯ˆ16 Β· ⭐ 140 Β· πŸ’€) - sGDML - Reference implementation of the Symmetric Gradient Domain Machine Learning model. MIT
- TensorMol (πŸ₯ˆ12 Β· ⭐ 270 Β· πŸ’€) - Tensorflow + Molecules = TensorMol. GPL-3.0 single-paper
- SIMPLE-NN (πŸ₯ˆ11 Β· ⭐ 47 Β· πŸ’€) - SIMPLE-NN (SNU Interatomic Machine-learning PotentiaL packagE version Neural Network). GPL-3.0
- NNsforMD (πŸ₯‰10 Β· ⭐ 10 Β· πŸ’€) - Neural network class for molecular dynamics to predict potential energy, forces and non-adiabatic couplings. MIT
- DimeNet (πŸ₯‰9 Β· ⭐ 300 Β· πŸ’€) - DimeNet and DimeNet++ models, as proposed in Directional Message Passing for Molecular Graphs (ICLR 2020) and Fast and.. Custom
- SchNet (πŸ₯‰9 Β· ⭐ 230 Β· πŸ’€) - SchNet - a deep learning architecture for quantum chemistry. MIT
- GemNet (πŸ₯‰9 Β· ⭐ 190 Β· πŸ’€) - GemNet model in PyTorch, as proposed in GemNet: Universal Directional Graph Neural Networks for Molecules (NeurIPS.. Custom
- AIMNet (πŸ₯‰8 Β· ⭐ 100 Β· πŸ’€) - Atoms In Molecules Neural Network Potential. MIT single-paper
- MACE-Jax (πŸ₯‰8 Β· ⭐ 64 Β· πŸ’€) - Equivariant machine learning interatomic potentials in JAX. MIT
- SIMPLE-NN v2 (πŸ₯‰8 Β· ⭐ 41 Β· πŸ’€) - SIMPLE-NN is an open package that constructs Behler-Parrinello-type neural-network interatomic potentials from ab.. GPL-3.0
- SNAP (πŸ₯‰8 Β· ⭐ 37 Β· πŸ’€) - Repository for spectral neighbor analysis potential (SNAP) model development. BSD-3
- Atomistic Adversarial Attacks (πŸ₯‰8 Β· ⭐ 34 Β· πŸ’€) - Code for performing adversarial attacks on atomistic systems using NN potentials. MIT probabilistic
- MEGNetSparse (πŸ₯‰8 Β· ⭐ 2) - A library implementing a graph neural network with sparse representation from Code for Kazeev, N., Al-Maeeni, A.R.,.. MIT material-defect
- PhysNet (πŸ₯‰7 Β· ⭐ 94 Β· πŸ’€) - Code for training PhysNet models. MIT electrostatics
- MLIP-3 (πŸ₯‰6 Β· ⭐ 26 Β· πŸ’€) - MLIP-3: Active learning on atomic environments with Moment Tensor Potentials (MTP). BSD-2 C++
- testing-framework (πŸ₯‰6 Β· ⭐ 11 Β· πŸ’€) - The purpose of this repository is to aid the testing of a large number of interatomic potentials for a variety of.. Unlicensed benchmarking
- PANNA (πŸ₯‰6 Β· ⭐ 10 Β· πŸ’€) - A package to train and validate all-to-all connected network models for BP[1] and modified-BP[2] type local atomic.. MIT benchmarking
- GN-MM (πŸ₯‰5 Β· ⭐ 10 Β· πŸ’€) - The Gaussian Moment Neural Network (GM-NN) package developed for large-scale atomistic simulations employing atomistic.. MIT active-learning MD rep-eng magnetism
- Alchemical learning (πŸ₯‰5 Β· ⭐ 2 Β· πŸ’€) - Code for the Modeling high-entropy transition metal alloys with alchemical compression article. BSD-3
- ACE1Pack.jl (πŸ₯‰5 Β· ⭐ 1 Β· πŸ’€) - Provides convenience functionality for the usage of ACE1.jl, ACEfit.jl, JuLIP.jl for fitting interatomic potentials.. MIT Julia
- NequIP-JAX (πŸ₯‰4 Β· ⭐ 20 Β· πŸ’€) - JAX implementation of the NequIP interatomic potential. Unlicensed
- Allegro-Legato (πŸ₯‰4 Β· ⭐ 19 Β· πŸ’€) - An extension of Allegro with enhanced robustness and time-to-failure. MIT MD
- glp (πŸ₯‰4 Β· ⭐ 18 Β· πŸ’€) - tools for graph-based machine-learning potentials in jax. MIT
- ACE Workflows (πŸ₯‰4 Β· πŸ’€) - Workflow Examples for ACE Models. Unlicensed Julia workflows
- PeriodicPotentials (πŸ₯‰4 Β· πŸ’€) - A Periodic table app that displays potentials based on the selected elements. MIT community-resource viz JavaScript
- PyFLAME (πŸ₯‰3 Β· πŸ’€) - An automated approach for developing neural network interatomic potentials with FLAME.. Unlicensed active-learning structure-prediction structure-optimization rep-eng Fortran
- SingleNN (πŸ₯‰2 Β· ⭐ 9 Β· πŸ’€) - An efficient package for training and executing neural-network interatomic potentials. Unlicensed C++
- AisNet (πŸ₯‰2 Β· ⭐ 3 Β· πŸ’€) - A Universal Interatomic Potential Neural Network with Encoded Local Environment Features.. MIT
- RuNNer (πŸ₯‰2) - The RuNNer Neural Network Energy Representation is a Fortran-based framework for the construction of Behler-.. GPL-3.0 Fortran
- Allegro-JAX (πŸ₯‰1 Β· ⭐ 21 Β· πŸ’€) - JAX implementation of the Allegro interatomic potential. Unlicensed
- nnp-pre-training (πŸ₯‰1 Β· ⭐ 6 Β· πŸ’€) - Synthetic pre-training for neural-network interatomic potentials. Unlicensed pretrained MD
- mag-ace (πŸ₯‰1 Β· ⭐ 2 Β· πŸ’€) - Magnetic ACE potential. FORTRAN interface for LAMMPS SPIN package. Unlicensed magnetism MD Fortran
- mlp (πŸ₯‰1 Β· ⭐ 1 Β· πŸ’€) - Proper orthogonal descriptors for efficient and accurate interatomic potentials... Unlicensed Julia


Language Models

Back to top

Projects that use (large) language models (LMs, LLMs) or natural language processing (NLP) techniques for atomistic ML.
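Several of the tools below (e.g. paper-qa, LLaMP) are retrieval-augmented generation (RAG) systems: before the LLM answers, relevant documents are retrieved by similarity to the query. Stripped to its core, that retrieval step looks like the pure-Python sketch below, which ranks a toy corpus by cosine similarity of term-count vectors; production systems use dense neural embeddings, but the ranking logic has the same shape. All names and documents here are illustrative.

```python
import math
from collections import Counter

# Toy corpus: document id -> text.
docs = {
    "flare": "active learning for interatomic potentials",
    "exmol": "counterfactual explanations for molecular properties",
    "rdkit": "cheminformatics toolkit for molecules",
}

def vec(text):
    """Bag-of-words term-count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=1):
    """Return the k document ids most similar to the query."""
    q = vec(query)
    ranked = sorted(docs, key=lambda d: cosine(q, vec(docs[d])), reverse=True)
    return ranked[:k]

print(retrieve("interatomic potentials"))  # the "flare" document wins
```

The retrieved text is then stuffed into the LLM prompt as grounding context, which is what lets tools like paper-qa cite the documents their answers come from.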

paper-qa (πŸ₯‡30 Β· ⭐ 6.7K) - High accuracy RAG for answering questions from scientific documents with citations. Apache-2 ai-agent - [GitHub](https://github.com/Future-House/paper-qa) (πŸ‘¨β€πŸ’» 31 Β· πŸ”€ 640 Β· πŸ“¦ 89 Β· πŸ“‹ 290 - 43% open Β· ⏱️ 30.12.2024):
git clone https://github.com/Future-House/paper-qa
- [PyPi](https://pypi.org/project/paper-qa) (πŸ“₯ 17K / month Β· πŸ“¦ 10 Β· ⏱️ 11.12.2024):
pip install paper-qa
ChemCrow (πŸ₯‡18 Β· ⭐ 660 Β· πŸ“ˆ) - Open source package for the accurate solution of reasoning-intensive chemical tasks. MIT ai-agent - [GitHub](https://github.com/ur-whitelab/chemcrow-public) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 98 Β· πŸ“¦ 8 Β· πŸ“‹ 22 - 36% open Β· ⏱️ 19.12.2024):
git clone https://github.com/ur-whitelab/chemcrow-public
- [PyPi](https://pypi.org/project/chemcrow) (πŸ“₯ 1.2K / month Β· ⏱️ 27.03.2024):
pip install chemcrow
OpenBioML ChemNLP (πŸ₯‡18 Β· ⭐ 150) - ChemNLP project. MIT datasets - [GitHub](https://github.com/OpenBioML/chemnlp) (πŸ‘¨β€πŸ’» 27 Β· πŸ”€ 45 Β· πŸ“‹ 250 - 44% open Β· ⏱️ 19.08.2024):
git clone https://github.com/OpenBioML/chemnlp
- [PyPi](https://pypi.org/project/chemnlp) (πŸ“₯ 270 / month Β· πŸ“¦ 1 Β· ⏱️ 07.08.2023):
pip install chemnlp
NIST ChemNLP (πŸ₯ˆ12 Β· ⭐ 73) - ChemNLP: A Natural Language Processing based Library for Materials Chemistry Text Data. MIT literature-data - [GitHub](https://github.com/usnistgov/chemnlp) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 17 Β· πŸ“¦ 4 Β· ⏱️ 19.08.2024):
git clone https://github.com/usnistgov/chemnlp
- [PyPi](https://pypi.org/project/chemnlp) (πŸ“₯ 270 / month Β· πŸ“¦ 1 Β· ⏱️ 07.08.2023):
pip install chemnlp
ChatMOF (πŸ₯ˆ11 Β· ⭐ 67) - Predict and Inverse design for metal-organic framework with large-language models (llms). MIT generative - [GitHub](https://github.com/Yeonghun1675/ChatMOF) (πŸ‘¨β€πŸ’» 1 Β· πŸ”€ 12 Β· πŸ“¦ 3 Β· ⏱️ 01.07.2024):
git clone https://github.com/Yeonghun1675/ChatMOF
- [PyPi](https://pypi.org/project/chatmof) (πŸ“₯ 840 / month Β· ⏱️ 01.07.2024):
pip install chatmof
AtomGPT (πŸ₯ˆ11 Β· ⭐ 36) - AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design.. Custom generative pretrained transformer - [GitHub](https://github.com/usnistgov/atomgpt) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 6 Β· πŸ“¦ 2 Β· ⏱️ 12.12.2024):
git clone https://github.com/usnistgov/atomgpt
- [PyPi](https://pypi.org/project/atomgpt) (πŸ“₯ 180 / month Β· ⏱️ 22.09.2024):
pip install atomgpt
LLaMP (πŸ₯‰7 Β· ⭐ 71) - A web app and Python API for multi-modal RAG framework to ground LLMs on high-fidelity materials informatics. An.. BSD-3 materials-discovery cheminformatics generative MD multimodal language-models Python general-tool - [GitHub](https://github.com/chiang-yuan/llamp) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 12 Β· πŸ“‹ 25 - 32% open Β· ⏱️ 14.10.2024):
git clone https://github.com/chiang-yuan/llamp
LLM-Prop (πŸ₯‰7 Β· ⭐ 30 Β· πŸ’€) - A repository for the LLM-Prop implementation. MIT - [GitHub](https://github.com/vertaix/LLM-Prop) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 6 Β· πŸ“‹ 2 - 50% open Β· ⏱️ 26.04.2024):
git clone https://github.com/vertaix/LLM-Prop
crystal-text-llm (πŸ₯‰5 Β· ⭐ 90 Β· πŸ’€) - Large language models to generate stable crystals. CC-BY-NC-4.0 materials-discovery - [GitHub](https://github.com/facebookresearch/crystal-text-llm) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 17 Β· πŸ“‹ 11 - 81% open Β· ⏱️ 18.06.2024):
git clone https://github.com/facebookresearch/crystal-text-llm
SciBot (πŸ₯‰5 Β· ⭐ 30) - SciBot is a simple demo of building a domain-specific chatbot for science. Unlicensed ai-agent - [GitHub](https://github.com/CFN-softbio/SciBot) (πŸ‘¨β€πŸ’» 1 Β· πŸ”€ 9 Β· πŸ“¦ 2 Β· ⏱️ 03.09.2024):
git clone https://github.com/CFN-softbio/SciBot
MAPI_LLM (πŸ₯‰5 Β· ⭐ 9 Β· πŸ’€) - A LLM application developed during the LLM March MADNESS Hackathon https://doi.org/10.1039/D3DD00113J. MIT ai-agent dataset - [GitHub](https://github.com/maykcaldas/MAPI_LLM) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 2 Β· ⏱️ 11.04.2024):
git clone https://github.com/maykcaldas/MAPI_LLM
Cephalo (πŸ₯‰5 Β· ⭐ 9) - Multimodal Vision-Language Models for Bio-Inspired Materials Analysis and Design. Apache-2 generative multimodal pretrained - [GitHub](https://github.com/lamm-mit/Cephalo) (πŸ”€ 1 Β· ⏱️ 23.07.2024):
git clone https://github.com/lamm-mit/Cephalo
Show 10 hidden projects... - ChemDataExtractor (πŸ₯ˆ16 Β· ⭐ 310 Β· πŸ’€) - Automatically extract chemical information from scientific documents. MIT literature-data - gptchem (πŸ₯ˆ13 Β· ⭐ 240 Β· πŸ’€) - Use GPT-3 to solve chemistry problems. MIT - mat2vec (πŸ₯ˆ12 Β· ⭐ 620 Β· πŸ’€) - Supplementary Materials for Tshitoyan et al. Unsupervised word embeddings capture latent knowledge from materials.. MIT rep-learn - nlcc (πŸ₯ˆ12 Β· ⭐ 44 Β· πŸ’€) - Natural language computational chemistry command line interface. MIT single-paper - MoLFormer (πŸ₯‰9 Β· ⭐ 280 Β· πŸ’€) - Repository for MolFormer. Apache-2 transformer pretrained drug-discovery - MolSkill (πŸ₯‰9 Β· ⭐ 100 Β· πŸ’€) - Extracting medicinal chemistry intuition via preference machine learning. MIT drug-discovery recommender - chemlift (πŸ₯‰7 Β· ⭐ 32 Β· πŸ’€) - Language-interfaced fine-tuning for chemistry. MIT - BERT-PSIE-TC (πŸ₯‰5 Β· ⭐ 12 Β· πŸ’€) - A dataset of Curie temperatures automatically extracted from scientific literature with the use of the BERT-PSIE.. MIT magnetism - CatBERTa (πŸ₯‰4 Β· ⭐ 22 Β· πŸ’€) - Large Language Model for Catalyst Property Prediction. Unlicensed transformer catalysis - ChemDataWriter (πŸ₯‰4 Β· ⭐ 14 Β· πŸ’€) - ChemDataWriter is a transformer-based library for automatically generating research books in the chemistry area. MIT literature-data


Materials Discovery

Back to top

Projects that implement materials discovery methods using atomistic ML.
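
Many of these projects drive the search with Bayesian optimization, picking the next candidate structure via an acquisition function. A self-contained sketch of expected improvement (EI) for maximization, ranking hypothetical candidates by predicted mean and uncertainty — plain Python, not any listed package's API:

```python
import math

def expected_improvement(mu, sigma, best, xi=0.01):
    """EI for maximization, given a surrogate's predicted mean and std-dev."""
    if sigma == 0.0:
        return max(mu - best - xi, 0.0)
    z = (mu - best - xi) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal PDF
    return (mu - best - xi) * Phi + sigma * phi

# Hypothetical candidates: name -> (predicted property, predicted uncertainty).
candidates = {"A": (1.2, 0.05), "B": (1.0, 0.6), "C": (0.4, 0.1)}
best_seen = 1.1
ranked = sorted(candidates,
                key=lambda c: expected_improvement(*candidates[c], best_seen),
                reverse=True)
```

Note how candidate B, though its predicted mean is below the best observed value, ranks first: its large uncertainty makes it the most informative next evaluation — the exploration/exploitation trade-off these packages automate.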

πŸ”— MatterGen - A generative model for inorganic materials design https://doi.org/10.48550/arXiv.2312.03687. generative proprietary

BOSS (πŸ₯‡14 Β· ⭐ 21) - Bayesian Optimization Structure Search (BOSS). Apache-2 probabilistic - [PyPi](https://pypi.org/project/aalto-boss) (πŸ“₯ 1.7K / month Β· ⏱️ 13.11.2024):
pip install aalto-boss
- [GitLab](https://gitlab.com/cest-group/boss) (πŸ”€ 11 Β· πŸ“‹ 31 - 6% open Β· ⏱️ 13.11.2024):
git clone https://gitlab.com/cest-group/boss
aviary (πŸ₯‡13 Β· ⭐ 48) - The Wren sits on its Roost in the Aviary. MIT - [GitHub](https://github.com/CompRhys/aviary) (πŸ‘¨β€πŸ’» 5 Β· πŸ”€ 12 Β· πŸ“‹ 31 - 12% open Β· ⏱️ 15.12.2024):
git clone https://github.com/CompRhys/aviary
Materials Discovery: GNoME (πŸ₯ˆ10 Β· ⭐ 920) - Graph Networks for Materials Science (GNoME) and dataset of 381,000 novel stable materials. Apache-2 UIP datasets rep-learn proprietary - [GitHub](https://github.com/google-deepmind/materials_discovery) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 150 Β· πŸ“‹ 25 - 84% open Β· ⏱️ 09.12.2024):
git clone https://github.com/google-deepmind/materials_discovery
AGOX (πŸ₯ˆ9 Β· ⭐ 14) - AGOX is a package for global optimization of atomic systems using e.g. the energy calculated from density functional.. GPL-3.0 structure-optimization - [PyPi](https://pypi.org/project/agox) (πŸ“₯ 240 / month Β· ⏱️ 23.10.2024):
pip install agox
- [GitLab](https://gitlab.com/agox/agox) (πŸ”€ 5 Β· πŸ“‹ 26 - 38% open Β· ⏱️ 23.10.2024):
git clone https://gitlab.com/agox/agox
CSPML (crystal structure prediction with machine learning-based element substitution) (πŸ₯ˆ6 Β· ⭐ 22) - Original implementation of CSPML. MIT structure-prediction - [GitHub](https://github.com/Minoru938/CSPML) (πŸ‘¨β€πŸ’» 1 Β· πŸ”€ 8 Β· πŸ“‹ 3 - 66% open Β· ⏱️ 22.12.2024):
git clone https://github.com/minoru938/cspml
Show 6 hidden projects... - Computational Autonomy for Materials Discovery (CAMD) (πŸ₯ˆ6 Β· ⭐ 1 Β· πŸ’€) - Agent-based sequential learning software for materials discovery. Apache-2 - MAGUS (πŸ₯‰4 Β· ⭐ 63 Β· πŸ’€) - Machine learning And Graph theory assisted Universal structure Searcher. Unlicensed structure-prediction active-learning - ML-atomate (πŸ₯‰4 Β· ⭐ 5 Β· πŸ’€) - Machine learning-assisted Atomate code for autonomous computational materials screening. GPL-3.0 active-learning workflows - closed-loop-acceleration-benchmarks (πŸ₯‰4 Β· πŸ’€) - Data and scripts in support of the publication By how much can closed-loop frameworks accelerate computational.. MIT materials-discovery active-learning single-paper - SPINNER (πŸ₯‰3 Β· ⭐ 12 Β· πŸ’€) - SPINNER (Structure Prediction of Inorganic crystals using Neural Network potentials with Evolutionary and Random.. GPL-3.0 C++ structure-prediction - sl_discovery (πŸ₯‰3 Β· ⭐ 5 Β· πŸ’€) - Data processing and models related to Quantifying the performance of machine learning models in materials discovery. Apache-2 materials-discovery single-paper


Mathematical tools

Back to top

Projects that implement mathematical objects used in atomistic machine learning.
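
A recurring object in this category is the real spherical harmonics evaluated directly from Cartesian coordinates (the task SpheriCart specializes in). A minimal illustration for l ≤ 1 using the standard closed-form expressions — optimized libraries extend this to high l with recursions and derivatives:

```python
import math

def real_sph_l01(x, y, z):
    """Real spherical harmonics for l = 0 and l = 1, ordered (Y00, Y1-1, Y10, Y11)."""
    r = math.sqrt(x * x + y * y + z * z)
    c0 = 0.5 * math.sqrt(1.0 / math.pi)          # Y00 is constant
    c1 = math.sqrt(3.0 / (4.0 * math.pi))        # l = 1 prefactor
    return [c0, c1 * y / r, c1 * z / r, c1 * x / r]

vals = real_sph_l01(1.0, 2.0, -0.5)
```

A useful sanity check is the addition theorem: for any direction, the squares of the (2l+1) components of a given l sum to (2l+1)/(4π), independent of orientation — which is exactly the rotational invariance these libraries exploit.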

KFAC-JAX (πŸ₯‡19 Β· ⭐ 260) - Second Order Optimization and Curvature Estimation with K-FAC in JAX. Apache-2 - [GitHub](https://github.com/google-deepmind/kfac-jax) (πŸ‘¨β€πŸ’» 17 Β· πŸ”€ 23 Β· πŸ“¦ 11 Β· πŸ“‹ 20 - 45% open Β· ⏱️ 19.12.2024):
git clone https://github.com/google-deepmind/kfac-jax
- [PyPi](https://pypi.org/project/kfac-jax) (πŸ“₯ 660 / month Β· πŸ“¦ 1 Β· ⏱️ 04.04.2024):
pip install kfac-jax
gpax (πŸ₯‡17 Β· ⭐ 220 Β· πŸ’€) - Gaussian Processes for Experimental Sciences. MIT probabilistic active-learning - [GitHub](https://github.com/ziatdinovmax/gpax) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 26 Β· πŸ“¦ 3 Β· πŸ“‹ 40 - 20% open Β· ⏱️ 21.05.2024):
git clone https://github.com/ziatdinovmax/gpax
- [PyPi](https://pypi.org/project/gpax) (πŸ“₯ 520 / month Β· ⏱️ 20.03.2024):
pip install gpax
SpheriCart (πŸ₯ˆ16 Β· ⭐ 75) - Multi-language library for the calculation of spherical harmonics in Cartesian coordinates. MIT - [GitHub](https://github.com/lab-cosmo/sphericart) (πŸ‘¨β€πŸ’» 11 Β· πŸ”€ 12 Β· πŸ“₯ 100 Β· πŸ“¦ 5 Β· πŸ“‹ 41 - 56% open Β· ⏱️ 07.11.2024):
git clone https://github.com/lab-cosmo/sphericart
- [PyPi](https://pypi.org/project/sphericart) (πŸ“₯ 700 / month Β· ⏱️ 04.09.2024):
pip install sphericart
Polynomials4ML.jl (πŸ₯ˆ11 Β· ⭐ 12 Β· πŸ’€) - Polynomials for ML: fast evaluation, batching, differentiation. MIT Julia - [GitHub](https://github.com/ACEsuit/Polynomials4ML.jl) (πŸ‘¨β€πŸ’» 10 Β· πŸ”€ 5 Β· πŸ“‹ 51 - 33% open Β· ⏱️ 22.06.2024):
git clone https://github.com/ACEsuit/Polynomials4ML.jl
GElib (πŸ₯ˆ9 Β· ⭐ 21) - C++/CUDA library for SO(3) equivariant operations. MPL-2.0 C++ - [GitHub](https://github.com/risi-kondor/GElib) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 3 Β· πŸ“‹ 8 - 50% open Β· ⏱️ 27.07.2024):
git clone https://github.com/risi-kondor/GElib
COSMO Toolbox (πŸ₯‰6 Β· ⭐ 7 Β· πŸ’€) - Assorted libraries and utilities for atomistic simulation analysis. Unlicensed C++ - [GitHub](https://github.com/lab-cosmo/toolbox) (πŸ‘¨β€πŸ’» 9 Β· πŸ”€ 7 Β· ⏱️ 19.03.2024):
git clone https://github.com/lab-cosmo/toolbox
Show 5 hidden projects... - lie-nn (πŸ₯ˆ9 Β· ⭐ 27 Β· πŸ’€) - Tools for building equivariant polynomials on reductive Lie groups. MIT rep-learn - EquivariantOperators.jl (πŸ₯‰6 Β· ⭐ 19 Β· πŸ’€) - This package is deprecated. Functionalities are migrating to Porcupine.jl. MIT Julia - cnine (πŸ₯‰5 Β· ⭐ 4) - Cnine tensor library. Unlicensed C++ - torch_spex (πŸ₯‰3 Β· ⭐ 3 Β· πŸ’€) - Spherical expansions in PyTorch. Unlicensed - Wigner Kernels (πŸ₯‰1 Β· ⭐ 2 Β· πŸ’€) - Collection of programs to benchmark Wigner kernels. Unlicensed benchmarking


Molecular Dynamics

Back to top

Projects that simplify the integration of molecular dynamics and atomistic machine learning.
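
The common integration pattern here is that the ML potential supplies forces to a classical integrator such as velocity Verlet. A minimal sketch with a harmonic force standing in for a learned potential — illustrative only, not any listed package's API:

```python
def velocity_verlet(pos, vel, force, mass, dt, steps):
    """Integrate Newton's equations; `force` can be any callable, e.g. an ML potential."""
    f = force(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt * dt
        f_new = force(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

# Harmonic oscillator standing in for a learned potential: F(x) = -k x.
k, mass, dt = 1.0, 1.0, 0.01
x, v = velocity_verlet(1.0, 0.0, lambda x: -k * x, mass, dt, 1000)
energy = 0.5 * mass * v * v + 0.5 * k * x * x   # should stay near the initial 0.5
```

The projects in this category replace the `force` callable with a trained model (and handle neighbor lists, parallelism, and thermostats), but the integrator-side contract is the same: positions in, forces out.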

JAX-MD (πŸ₯‡25 Β· ⭐ 1.2K) - Differentiable, Hardware Accelerated, Molecular Dynamics. Apache-2 - [GitHub](https://github.com/jax-md/jax-md) (πŸ‘¨β€πŸ’» 36 Β· πŸ”€ 200 Β· πŸ“¦ 64 Β· πŸ“‹ 160 - 49% open Β· ⏱️ 26.11.2024):
git clone https://github.com/jax-md/jax-md
- [PyPi](https://pypi.org/project/jax-md) (πŸ“₯ 3.7K / month Β· πŸ“¦ 3 Β· ⏱️ 09.08.2023):
pip install jax-md
mlcolvar (πŸ₯ˆ19 Β· ⭐ 95) - A unified framework for machine learning collective variables for enhanced sampling simulations. MIT sampling - [GitHub](https://github.com/luigibonati/mlcolvar) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 26 Β· πŸ“¦ 3 Β· πŸ“‹ 74 - 17% open Β· ⏱️ 25.11.2024):
git clone https://github.com/luigibonati/mlcolvar
- [PyPi](https://pypi.org/project/mlcolvar) (πŸ“₯ 200 / month Β· ⏱️ 12.06.2024):
pip install mlcolvar
FitSNAP (πŸ₯ˆ18 Β· ⭐ 160) - Software for generating machine-learning interatomic potentials for LAMMPS. GPL-2.0 - [GitHub](https://github.com/FitSNAP/FitSNAP) (πŸ‘¨β€πŸ’» 24 Β· πŸ”€ 54 Β· πŸ“₯ 13 Β· πŸ“‹ 73 - 21% open Β· ⏱️ 02.12.2024):
git clone https://github.com/FitSNAP/FitSNAP
- [Conda](https://anaconda.org/conda-forge/fitsnap3) (πŸ“₯ 9.9K Β· ⏱️ 16.06.2023):
conda install -c conda-forge fitsnap3
openmm-torch (πŸ₯ˆ17 Β· ⭐ 190) - OpenMM plugin to define forces with neural networks. Custom ML-IAP C++ - [GitHub](https://github.com/openmm/openmm-torch) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 24 Β· πŸ“‹ 96 - 29% open Β· ⏱️ 11.11.2024):
git clone https://github.com/openmm/openmm-torch
- [Conda](https://anaconda.org/conda-forge/openmm-torch) (πŸ“₯ 590K Β· ⏱️ 12.11.2024):
conda install -c conda-forge openmm-torch
OpenMM-ML (πŸ₯‰12 Β· ⭐ 85) - High level API for using machine learning models in OpenMM simulations. MIT ML-IAP - [GitHub](https://github.com/openmm/openmm-ml) (πŸ‘¨β€πŸ’» 5 Β· πŸ”€ 20 Β· πŸ“‹ 55 - 36% open Β· ⏱️ 06.08.2024):
git clone https://github.com/openmm/openmm-ml
- [Conda](https://anaconda.org/conda-forge/openmm-ml) (πŸ“₯ 6.4K Β· ⏱️ 07.06.2024):
conda install -c conda-forge openmm-ml
pair_nequip (πŸ₯‰10 Β· ⭐ 41 Β· πŸ’€) - LAMMPS pair style for NequIP. MIT ML-IAP rep-learn - [GitHub](https://github.com/mir-group/pair_nequip) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 13 Β· πŸ“‹ 31 - 35% open Β· ⏱️ 05.06.2024):
git clone https://github.com/mir-group/pair_nequip
PACE (πŸ₯‰10 Β· ⭐ 28) - The LAMMPS ML-IAP `pair_style pace`, aka Atomic Cluster Expansion (ACE), aka ML-PACE,.. Custom - [GitHub](https://github.com/ICAMS/lammps-user-pace) (πŸ‘¨β€πŸ’» 8 Β· πŸ”€ 12 Β· πŸ“‹ 8 - 25% open Β· ⏱️ 17.12.2024):
git clone https://github.com/ICAMS/lammps-user-pace
pair_allegro (πŸ₯‰7 Β· ⭐ 39 Β· πŸ’€) - LAMMPS pair style for Allegro deep learning interatomic potentials with parallelization support. MIT ML-IAP rep-learn - [GitHub](https://github.com/mir-group/pair_allegro) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 8 Β· πŸ“‹ 33 - 45% open Β· ⏱️ 05.06.2024):
git clone https://github.com/mir-group/pair_allegro
SOMD (πŸ₯‰6 Β· ⭐ 14) - Molecular dynamics package designed for the SIESTA DFT code. AGPL-3.0 ML-IAP active-learning - [GitHub](https://github.com/initqp/somd) (πŸ”€ 2 Β· ⏱️ 04.11.2024):
git clone https://github.com/initqp/somd
Show 1 hidden projects... - interface-lammps-mlip-3 (πŸ₯‰3 Β· ⭐ 5 Β· πŸ’€) - An interface between LAMMPS and MLIP (version 3). GPL-2.0


Reinforcement Learning

Back to top

Projects that focus on reinforcement learning for atomistic ML.

Show 2 hidden projects... - ReLeaSE (πŸ₯‡11 Β· ⭐ 350 Β· πŸ’€) - Deep Reinforcement Learning for de-novo Drug Design. MIT drug-discovery - CatGym (πŸ₯‰6 Β· ⭐ 11 Β· πŸ’€) - Surface segregation using Deep Reinforcement Learning. GPL


Representation Engineering

Back to top

Projects that offer implementations of representations (aka descriptors or fingerprints) of atomistic systems, and of models built on them, aka feature engineering.
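
A classic example of such a descriptor is the Behler-Parrinello radial symmetry function G2, which is invariant to rotations, translations, and neighbor ordering by construction. A minimal sketch (plain Python; `eta`, `r_s`, `r_c` are the usual hyperparameters, not any specific package's defaults):

```python
import math

def cutoff(r, r_c):
    """Smooth cosine cutoff used in Behler-Parrinello symmetry functions."""
    return 0.5 * (math.cos(math.pi * r / r_c) + 1.0) if r < r_c else 0.0

def g2(center, neighbors, eta=1.0, r_s=0.0, r_c=6.0):
    """Radial symmetry function G2 for one atom: a sum over neighbor distances."""
    total = 0.0
    for n in neighbors:
        r = math.dist(center, n)
        total += math.exp(-eta * (r - r_s) ** 2) * cutoff(r, r_c)
    return total

atoms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.5, 0.0)]
d = g2(atoms[0], atoms[1:])
```

Because G2 depends only on interatomic distances, permuting or rigidly rotating the neighbors leaves it unchanged — the key property that makes such descriptors usable as ML inputs.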

cdk (πŸ₯‡26 Β· ⭐ 500) - The Chemistry Development Kit. LGPL-2.1 cheminformatics Java - [GitHub](https://github.com/cdk/cdk) (πŸ‘¨β€πŸ’» 170 Β· πŸ”€ 160 Β· πŸ“₯ 24K Β· πŸ“‹ 300 - 10% open Β· ⏱️ 17.12.2024):
git clone https://github.com/cdk/cdk
- [Maven](https://search.maven.org/artifact/org.openscience.cdk/cdk-bundle) (πŸ“¦ 16 Β· ⏱️ 21.08.2023):
<dependency>
    <groupId>org.openscience.cdk</groupId>
    <artifactId>cdk-bundle</artifactId>
    <version>[VERSION]</version>
</dependency>
DScribe (πŸ₯‡25 Β· ⭐ 410 Β· πŸ’€) - DScribe is a python package for creating machine learning descriptors for atomistic systems. Apache-2 - [GitHub](https://github.com/SINGROUP/dscribe) (πŸ‘¨β€πŸ’» 18 Β· πŸ”€ 88 Β· πŸ“¦ 220 Β· πŸ“‹ 100 - 11% open Β· ⏱️ 28.05.2024):
git clone https://github.com/SINGROUP/dscribe
- [PyPi](https://pypi.org/project/dscribe) (πŸ“₯ 63K / month Β· πŸ“¦ 35 Β· ⏱️ 28.05.2024):
pip install dscribe
- [Conda](https://anaconda.org/conda-forge/dscribe) (πŸ“₯ 160K Β· ⏱️ 28.05.2024):
conda install -c conda-forge dscribe
MODNet (πŸ₯‡16 Β· ⭐ 82) - MODNet: a framework for machine learning materials properties. MIT pretrained small-data transfer-learning - [GitHub](https://github.com/ppdebreuck/modnet) (πŸ‘¨β€πŸ’» 11 Β· πŸ”€ 33 Β· πŸ“¦ 10 Β· πŸ“‹ 56 - 46% open Β· ⏱️ 28.11.2024):
git clone https://github.com/ppdebreuck/modnet
Rascaline (πŸ₯‡16 Β· ⭐ 49 Β· πŸ“ˆ) - Computing representations for atomistic machine learning. BSD-3 Rust C++ - [GitHub](https://github.com/metatensor/featomic) (πŸ‘¨β€πŸ’» 14 Β· πŸ”€ 14 Β· πŸ“₯ 22 Β· πŸ“‹ 71 - 46% open Β· ⏱️ 20.12.2024):
git clone https://github.com/metatensor/featomic
GlassPy (πŸ₯ˆ14 Β· ⭐ 29) - Python module for scientists working with glass materials. GPL-3.0 - [GitHub](https://github.com/drcassar/glasspy) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 7 Β· πŸ“¦ 7 Β· πŸ“‹ 15 - 46% open Β· ⏱️ 13.10.2024):
git clone https://github.com/drcassar/glasspy
- [PyPi](https://pypi.org/project/glasspy) (πŸ“₯ 720 / month Β· ⏱️ 05.09.2024):
pip install glasspy
SISSO (πŸ₯ˆ12 Β· ⭐ 260) - A data-driven method combining symbolic regression and compressed sensing for accurate & interpretable models. Apache-2 Fortran - [GitHub](https://github.com/rouyang2017/SISSO) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 85 Β· πŸ“‹ 77 - 23% open Β· ⏱️ 20.09.2024):
git clone https://github.com/rouyang2017/SISSO
fplib (πŸ₯‰8 Β· ⭐ 7 Β· πŸ“‰) - libfp is a library for calculating crystalline fingerprints and measuring similarities of materials. MIT C-lang single-paper - [GitHub](https://github.com/Rutgers-ZRG/libfp) (πŸ”€ 1 Β· πŸ“¦ 1 Β· ⏱️ 15.10.2024):
git clone https://github.com/Rutgers-ZRG/libfp
NICE (πŸ₯‰7 Β· ⭐ 12 Β· πŸ’€) - NICE (N-body Iteratively Contracted Equivariants) is a set of tools designed for the calculation of invariant and.. MIT - [GitHub](https://github.com/lab-cosmo/nice) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 3 Β· πŸ“‹ 3 - 66% open Β· ⏱️ 15.04.2024):
git clone https://github.com/lab-cosmo/nice
milad (πŸ₯‰6 Β· ⭐ 31) - Moment Invariants Local Atomic Descriptor. GPL-3.0 generative - [GitHub](https://github.com/muhrin/milad) (πŸ‘¨β€πŸ’» 1 Β· πŸ”€ 2 Β· πŸ“¦ 3 Β· ⏱️ 20.08.2024):
git clone https://github.com/muhrin/milad
SA-GPR (πŸ₯‰6 Β· ⭐ 19) - Public repository for symmetry-adapted Gaussian Process Regression (SA-GPR). LGPL-3.0 C-lang - [GitHub](https://github.com/dilkins/TENSOAP) (πŸ‘¨β€πŸ’» 5 Β· πŸ”€ 14 Β· πŸ“‹ 7 - 28% open Β· ⏱️ 23.07.2024):
git clone https://github.com/dilkins/TENSOAP
Show 15 hidden projects... - CatLearn (πŸ₯‡16 Β· ⭐ 100 Β· πŸ’€) - GPL-3.0 surface-science - Librascal (πŸ₯ˆ13 Β· ⭐ 80 Β· πŸ’€) - A scalable and versatile library to generate representations for atomic-scale learning. LGPL-2.1 - BenchML (πŸ₯ˆ12 Β· ⭐ 15 Β· πŸ’€) - ML benchmarking and pipeling framework. Apache-2 benchmarking - cmlkit (πŸ₯ˆ11 Β· ⭐ 34 Β· πŸ’€) - tools for machine learning in condensed matter physics and quantum chemistry. MIT benchmarking - CBFV (πŸ₯ˆ11 Β· ⭐ 27 Β· πŸ’€) - Tool to quickly create a composition-based feature vector. Unlicensed - SkipAtom (πŸ₯‰10 Β· ⭐ 24 Β· πŸ’€) - Distributed representations of atoms, inspired by the Skip-gram model. MIT - SOAPxx (πŸ₯‰6 Β· ⭐ 7 Β· πŸ’€) - A SOAP implementation. GPL-2.0 C++ - pyLODE (πŸ₯‰6 Β· ⭐ 3 Β· πŸ’€) - Pythonic implementation of LOng Distance Equivariants. Apache-2 electrostatics - AMP (πŸ₯‰6 Β· πŸ’€) - Amp is an open-source package designed to easily bring machine-learning to atomistic calculations. Unlicensed - MXenes4HER (πŸ₯‰5 Β· ⭐ 6 Β· πŸ’€) - Predicting hydrogen evolution (HER) activity over 4500 MXene materials https://doi.org/10.1039/D3TA00344B. GPL-3.0 materials-discovery catalysis scikit-learn single-paper - soap_turbo (πŸ₯‰5 Β· ⭐ 5 Β· πŸ’€) - soap_turbo comprises a series of libraries to be used in combination with QUIP/GAP and TurboGAP. Custom Fortran - SISSO++ (πŸ₯‰5 Β· ⭐ 3 Β· πŸ’€) - C++ Implementation of SISSO with python bindings. Apache-2 C++ - automl-materials (πŸ₯‰4 Β· ⭐ 5 Β· πŸ’€) - AutoML for Regression Tasks on Small Tabular Data in Materials Design. MIT autoML benchmarking single-paper - magnetism-prediction (πŸ₯‰4 Β· ⭐ 1 Β· πŸ’€) - DFT-aided Machine Learning Search for Magnetism in Fe-based Bimetallic Chalcogenides. Apache-2 magnetism single-paper - ML-for-CurieTemp-Predictions (πŸ₯‰3 Β· ⭐ 1 Β· πŸ’€) - Machine Learning Predictions of High-Curie-Temperature Materials. MIT single-paper magnetism


Representation Learning

Back to top

General models that learn representations (embeddings) of atomistic systems, such as message-passing neural networks (MPNNs).
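
At the core of most MPNNs is a message-passing step: each node aggregates features from its neighbors over the graph's edges and updates its own state. A deliberately minimal scalar-feature sketch (real models use learned, vector-valued messages and nonlinear updates):

```python
def message_pass(features, edges):
    """One sum-aggregation message-passing step: h_i' = h_i + sum over neighbors j of h_j."""
    updated = []
    for i, h in enumerate(features):
        # Collect messages along directed edges (src, dst) pointing at node i.
        msg = sum(features[j] for (a, j) in edges if a == i)
        updated.append(h + msg)
    return updated

# Triangle graph: every node hears from its two neighbors.
edges = [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
h = message_pass([1.0, 2.0, 3.0], edges)
```

Stacking such steps lets information propagate over growing neighborhoods; the libraries in this category add trainable message/update functions, batching, and (in the equivariant ones) tensorial features that transform correctly under rotation.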

PyG Models (πŸ₯‡35 Β· ⭐ 22K) - Representation learning models implemented in PyTorch Geometric. MIT general-ml - [GitHub](https://github.com/pyg-team/pytorch_geometric) (πŸ‘¨β€πŸ’» 530 Β· πŸ”€ 3.7K Β· πŸ“¦ 7.4K Β· πŸ“‹ 3.8K - 29% open Β· ⏱️ 30.12.2024):
git clone https://github.com/pyg-team/pytorch_geometric
Deep Graph Library (DGL) (πŸ₯‡35 Β· ⭐ 14K) - Python package built to ease deep learning on graph, on top of existing DL frameworks. Apache-2 - [GitHub](https://github.com/dmlc/dgl) (πŸ‘¨β€πŸ’» 300 Β· πŸ”€ 3K Β· πŸ“¦ 330 Β· πŸ“‹ 2.9K - 18% open Β· ⏱️ 18.10.2024):
git clone https://github.com/dmlc/dgl
- [PyPi](https://pypi.org/project/dgl) (πŸ“₯ 95K / month Β· πŸ“¦ 150 Β· ⏱️ 13.05.2024):
pip install dgl
- [Conda](https://anaconda.org/dglteam/dgl) (πŸ“₯ 400K Β· ⏱️ 03.09.2024):
conda install -c dglteam dgl
e3nn (πŸ₯‡28 Β· ⭐ 1K) - A modular framework for neural networks with Euclidean symmetry. MIT - [GitHub](https://github.com/e3nn/e3nn) (πŸ‘¨β€πŸ’» 34 Β· πŸ”€ 140 Β· πŸ“¦ 370 Β· πŸ“‹ 160 - 14% open Β· ⏱️ 23.12.2024):
git clone https://github.com/e3nn/e3nn
- [PyPi](https://pypi.org/project/e3nn) (πŸ“₯ 170K / month Β· πŸ“¦ 34 Β· ⏱️ 06.11.2024):
pip install e3nn
- [Conda](https://anaconda.org/conda-forge/e3nn) (πŸ“₯ 28K Β· ⏱️ 21.12.2024):
conda install -c conda-forge e3nn
SchNetPack (πŸ₯‡26 Β· ⭐ 800) - SchNetPack - Deep Neural Networks for Atomistic Systems. MIT - [GitHub](https://github.com/atomistic-machine-learning/schnetpack) (πŸ‘¨β€πŸ’» 36 Β· πŸ”€ 210 Β· πŸ“¦ 96 Β· πŸ“‹ 260 - 2% open Β· ⏱️ 26.11.2024):
git clone https://github.com/atomistic-machine-learning/schnetpack
- [PyPi](https://pypi.org/project/schnetpack) (πŸ“₯ 830 / month Β· πŸ“¦ 4 Β· ⏱️ 05.09.2024):
pip install schnetpack
MatGL (Materials Graph Library) (πŸ₯‡24 Β· ⭐ 300) - Graph deep learning library for materials. BSD-3 multifidelity - [GitHub](https://github.com/materialsvirtuallab/matgl) (πŸ‘¨β€πŸ’» 17 Β· πŸ”€ 68 Β· πŸ“¦ 59 Β· πŸ“‹ 110 - 6% open Β· ⏱️ 31.12.2024):
git clone https://github.com/materialsvirtuallab/matgl
- [PyPi](https://pypi.org/project/m3gnet) (πŸ“₯ 880 / month Β· πŸ“¦ 5 Β· ⏱️ 17.11.2022):
pip install m3gnet
e3nn-jax (πŸ₯ˆ22 Β· ⭐ 190) - jax library for E3 Equivariant Neural Networks. Apache-2 - [GitHub](https://github.com/e3nn/e3nn-jax) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 18 Β· πŸ“¦ 46 Β· πŸ“‹ 23 - 4% open Β· ⏱️ 15.12.2024):
git clone https://github.com/e3nn/e3nn-jax
- [PyPi](https://pypi.org/project/e3nn-jax) (πŸ“₯ 2.9K / month Β· πŸ“¦ 13 Β· ⏱️ 14.08.2024):
pip install e3nn-jax
ALIGNN (πŸ₯ˆ21 Β· ⭐ 240) - Atomistic Line Graph Neural Network https://scholar.google.com/citations?user=9Q-tNnwAAAAJ&hl=en.. Custom - [GitHub](https://github.com/usnistgov/alignn) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 86 Β· πŸ“¦ 17 Β· πŸ“‹ 70 - 61% open Β· ⏱️ 02.12.2024):
git clone https://github.com/usnistgov/alignn
- [PyPi](https://pypi.org/project/alignn) (πŸ“₯ 6.1K / month Β· πŸ“¦ 8 Β· ⏱️ 02.12.2024):
pip install alignn
NVIDIA Deep Learning Examples for Tensor Cores (πŸ₯ˆ20 Β· ⭐ 14K Β· πŸ’€) - State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and.. Custom educational drug-discovery - [GitHub](https://github.com/NVIDIA/DeepLearningExamples) (πŸ‘¨β€πŸ’» 120 Β· πŸ”€ 3.2K Β· πŸ“‹ 910 - 37% open Β· ⏱️ 04.04.2024):
git clone https://github.com/NVIDIA/DeepLearningExamples
DIG: Dive into Graphs (πŸ₯ˆ20 Β· ⭐ 1.9K Β· πŸ’€) - A library for graph deep learning research. GPL-3.0 - [GitHub](https://github.com/divelab/DIG) (πŸ‘¨β€πŸ’» 50 Β· πŸ”€ 280 Β· πŸ“‹ 210 - 16% open Β· ⏱️ 04.02.2024):
git clone https://github.com/divelab/DIG
- [PyPi](https://pypi.org/project/dive-into-graphs) (πŸ“₯ 840 / month Β· ⏱️ 27.06.2022):
pip install dive-into-graphs
matsciml (πŸ₯ˆ19 Β· ⭐ 160) - Open MatSci ML Toolkit is a framework for prototyping and scaling out deep learning models for materials discovery.. MIT workflows benchmarking - [GitHub](https://github.com/IntelLabs/matsciml) (πŸ‘¨β€πŸ’» 12 Β· πŸ”€ 23 Β· πŸ“‹ 66 - 34% open Β· ⏱️ 20.12.2024):
git clone https://github.com/IntelLabs/matsciml
Uni-Mol (πŸ₯ˆ18 Β· ⭐ 760) - Official Repository for the Uni-Mol Series Methods. MIT pretrained - [GitHub](https://github.com/deepmodeling/Uni-Mol) (πŸ‘¨β€πŸ’» 19 Β· πŸ”€ 130 Β· πŸ“₯ 17K Β· πŸ“‹ 180 - 44% open Β· ⏱️ 02.01.2025):
git clone https://github.com/deepmodeling/Uni-Mol
kgcnn (πŸ₯ˆ18 Β· ⭐ 110 Β· πŸ’€) - Graph convolutions in Keras with TensorFlow, PyTorch or Jax. MIT - [GitHub](https://github.com/aimat-lab/gcnn_keras) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 31 Β· πŸ“¦ 19 Β· πŸ“‹ 87 - 14% open Β· ⏱️ 06.05.2024):
git clone https://github.com/aimat-lab/gcnn_keras
- [PyPi](https://pypi.org/project/kgcnn) (πŸ“₯ 630 / month Β· πŸ“¦ 3 Β· ⏱️ 27.02.2024):
pip install kgcnn
escnn (πŸ₯ˆ16 Β· ⭐ 380) - Equivariant Steerable CNNs Library for Pytorch https://quva-lab.github.io/escnn/. Custom - [GitHub](https://github.com/QUVA-Lab/escnn) (πŸ‘¨β€πŸ’» 10 Β· πŸ”€ 47 Β· πŸ“‹ 75 - 50% open Β· ⏱️ 31.10.2024):
git clone https://github.com/QUVA-Lab/escnn
- [PyPi](https://pypi.org/project/escnn) (πŸ“₯ 1.1K / month Β· πŸ“¦ 6 Β· ⏱️ 01.04.2022):
pip install escnn
Graphormer (πŸ₯ˆ15 Β· ⭐ 2.2K Β· πŸ’€) - Graphormer is a general-purpose deep learning backbone for molecular modeling. MIT transformer pretrained - [GitHub](https://github.com/microsoft/Graphormer) (πŸ‘¨β€πŸ’» 14 Β· πŸ”€ 330 Β· πŸ“‹ 160 - 57% open Β· ⏱️ 28.05.2024):
git clone https://github.com/microsoft/Graphormer
HydraGNN (πŸ₯ˆ14 Β· ⭐ 68) - Distributed PyTorch implementation of multi-headed graph convolutional neural networks. BSD-3 - [GitHub](https://github.com/ORNL/HydraGNN) (πŸ‘¨β€πŸ’» 15 Β· πŸ”€ 28 Β· πŸ“¦ 2 Β· πŸ“‹ 49 - 34% open Β· ⏱️ 31.12.2024):
git clone https://github.com/ORNL/HydraGNN
Compositionally-Restricted Attention-Based Network (CrabNet) (πŸ₯ˆ13 Β· ⭐ 15) - Predict materials properties using only the composition information!. MIT - [GitHub](https://github.com/sparks-baird/CrabNet) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 5 Β· πŸ“¦ 14 Β· πŸ“‹ 19 - 84% open Β· ⏱️ 09.09.2024):
git clone https://github.com/sparks-baird/CrabNet
- [PyPi](https://pypi.org/project/crabnet) (πŸ“₯ 1.1K / month Β· πŸ“¦ 2 Β· ⏱️ 10.01.2023):
pip install crabnet
hippynn (πŸ₯ˆ12 Β· ⭐ 72) - python library for atomistic machine learning. Custom workflows - [GitHub](https://github.com/lanl/hippynn) (πŸ‘¨β€πŸ’» 14 Β· πŸ”€ 23 Β· πŸ“¦ 2 Β· πŸ“‹ 22 - 45% open Β· ⏱️ 31.10.2024):
git clone https://github.com/lanl/hippynn
Atom2Vec (πŸ₯ˆ10 Β· ⭐ 36 Β· πŸ’€) - Atom2Vec: a simple way to describe atoms for machine learning. MIT - [GitHub](https://github.com/idocx/Atom2Vec) (πŸ‘¨β€πŸ’» 1 Β· πŸ”€ 9 Β· πŸ“¦ 3 Β· πŸ“‹ 4 - 75% open Β· ⏱️ 23.02.2024):
git clone https://github.com/idocx/Atom2Vec
- [PyPi](https://pypi.org/project/atom2vec) (πŸ“₯ 120 / month Β· ⏱️ 23.02.2024):
pip install atom2vec
GATGNN: Global Attention Graph Neural Network (πŸ₯‰9 Β· ⭐ 72) - Pytorch Repository for our work: Graph convolutional neural networks with global attention for improved materials.. MIT - [GitHub](https://github.com/superlouis/GATGNN) (πŸ‘¨β€πŸ’» 4 Β· πŸ”€ 16 Β· πŸ“‹ 7 - 57% open Β· ⏱️ 17.12.2024):
git clone https://github.com/superlouis/GATGNN
EquiformerV2 (πŸ₯‰8 Β· ⭐ 230) - [ICLR 2024] EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations. MIT - [GitHub](https://github.com/atomicarchitects/equiformer_v2) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 32 Β· πŸ“‹ 19 - 68% open Β· ⏱️ 16.07.2024):
git clone https://github.com/atomicarchitects/equiformer_v2
Equiformer (πŸ₯‰8 Β· ⭐ 220) - [ICLR 2023 Spotlight] Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs. MIT transformer - [GitHub](https://github.com/atomicarchitects/equiformer) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 40 Β· πŸ“‹ 18 - 50% open Β· ⏱️ 18.07.2024):
git clone https://github.com/atomicarchitects/equiformer
graphite (πŸ₯‰8 Β· ⭐ 66) - A repository for implementing graph network models based on atomic structures. MIT - [GitHub](https://github.com/LLNL/graphite) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 9 Β· πŸ“¦ 15 Β· πŸ“‹ 4 - 75% open Β· ⏱️ 08.08.2024):
git clone https://github.com/llnl/graphite
DeeperGATGNN (πŸ₯‰8 Β· ⭐ 49 Β· πŸ’€) - Scalable graph neural networks for materials property prediction. MIT - [GitHub](https://github.com/usccolumbia/deeperGATGNN) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 7 Β· πŸ“‹ 12 - 33% open Β· ⏱️ 19.01.2024):
git clone https://github.com/usccolumbia/deeperGATGNN
T-e3nn (πŸ₯‰8 Β· ⭐ 12) - Time-reversal Euclidean neural networks based on e3nn. MIT magnetism - [GitHub](https://github.com/Hongyu-yu/T-e3nn) (πŸ‘¨β€πŸ’» 26 Β· πŸ”€ 1 Β· ⏱️ 29.09.2024):
git clone https://github.com/Hongyu-yu/T-e3nn
Show 34 hidden projects... - dgl-lifesci (πŸ₯‡24 Β· ⭐ 740 Β· πŸ’€) - Python package for graph neural networks in chemistry and biology. Apache-2 - benchmarking-gnns (πŸ₯ˆ14 Β· ⭐ 2.5K Β· πŸ’€) - Repository for benchmarking graph neural networks. MIT single-paper benchmarking - Crystal Graph Convolutional Neural Networks (CGCNN) (πŸ₯ˆ13 Β· ⭐ 670 Β· πŸ’€) - Crystal graph convolutional neural networks for predicting material properties. MIT - Neural fingerprint (nfp) (πŸ₯ˆ12 Β· ⭐ 57 Β· πŸ’€) - Keras layers for end-to-end learning with rdkit and pymatgen. Custom - FAENet (πŸ₯ˆ11 Β· ⭐ 33 Β· πŸ’€) - Frame Averaging Equivariant GNN for materials modeling. MIT - pretrained-gnns (πŸ₯ˆ10 Β· ⭐ 980 Β· πŸ’€) - Strategies for Pre-training Graph Neural Networks. MIT pretrained - GDC (πŸ₯ˆ10 Β· ⭐ 270 Β· πŸ’€) - Graph Diffusion Convolution, as proposed in Diffusion Improves Graph Learning (NeurIPS 2019). MIT generative - SE(3)-Transformers (πŸ₯‰9 Β· ⭐ 500 Β· πŸ’€) - code for the SE3 Transformers paper: https://arxiv.org/abs/2006.10503. MIT single-paper transformer - ai4material_design (πŸ₯‰9 Β· ⭐ 6 Β· πŸ’€) - Code for Kazeev, N., Al-Maeeni, A.R., Romanov, I. et al. Sparse representation for machine learning the properties of.. Apache-2 pretrained material-defect - molecularGNN_smiles (πŸ₯‰8 Β· ⭐ 300 Β· πŸ’€) - The code of a graph neural network (GNN) for molecules, which is based on learning representations of r-radius.. Apache-2 - CGAT (πŸ₯‰8 Β· ⭐ 27 Β· πŸ’€) - Crystal graph attention neural networks for materials prediction. MIT - UVVisML (πŸ₯‰8 Β· ⭐ 26 Β· πŸ’€) - Predict optical properties of molecules with machine learning. MIT optical-properties single-paper probabilistic - tensorfieldnetworks (πŸ₯‰7 Β· ⭐ 150 Β· πŸ’€) - Rotation- and translation-equivariant neural networks for 3D point clouds. MIT - DTNN (πŸ₯‰7 Β· ⭐ 78 Β· πŸ’€) - Deep Tensor Neural Network. MIT - Cormorant (πŸ₯‰7 Β· ⭐ 60 Β· πŸ’€) - Codebase for Cormorant Neural Networks. Custom - AdsorbML (πŸ₯‰7 Β· ⭐ 39 Β· πŸ’€) - MIT surface-science single-paper - escnn_jax (πŸ₯‰7 Β· ⭐ 29 Β· πŸ’€) - Equivariant Steerable CNNs Library for Pytorch https://quva-lab.github.io/escnn/. Custom - ML4pXRDs (πŸ₯‰7 Β· πŸ’€) - Contains code to train neural networks based on simulated powder XRDs from synthetic crystals. MIT XRD single-paper - MACE-Layer (πŸ₯‰6 Β· ⭐ 33 Β· πŸ’€) - Higher order equivariant graph neural networks for 3D point clouds. MIT - charge_transfer_nnp (πŸ₯‰6 Β· ⭐ 33 Β· πŸ’€) - Graph neural network potential with charge transfer. MIT electrostatics - GLAMOUR (πŸ₯‰6 Β· ⭐ 21 Β· πŸ’€) - Graph Learning over Macromolecule Representations. MIT single-paper - Autobahn (πŸ₯‰5 Β· ⭐ 29 Β· πŸ’€) - Repository for Autobahn: Automorphism Based Graph Neural Networks. MIT - FieldSchNet (πŸ₯‰5 Β· ⭐ 19 Β· πŸ’€) - Deep neural network for molecules in external fields. MIT - SCFNN (πŸ₯‰5 Β· ⭐ 14 Β· πŸ’€) - Self-consistent determination of long-range electrostatics in neural network potentials. MIT C++ electrostatics single-paper - CraTENet (πŸ₯‰5 Β· ⭐ 14 Β· πŸ’€) - An attention-based deep neural network for thermoelectric transport properties. MIT transport-phenomena - EGraFFBench (πŸ₯‰5 Β· ⭐ 10 Β· πŸ’€) - Unlicensed single-paper benchmarking ML-IAP - Per-Site CGCNN (πŸ₯‰5 Β· ⭐ 1 Β· πŸ’€) - Crystal graph convolutional neural networks for predicting material properties. MIT pretrained single-paper - Per-site PAiNN (πŸ₯‰5 Β· ⭐ 1 Β· πŸ’€) - Fork of PaiNN for PerovskiteOrderingGCNNs. MIT probabilistic pretrained single-paper - Graph Transport Network (πŸ₯‰4 Β· ⭐ 16 Β· πŸ’€) - Graph transport network (GTN), as proposed in Scalable Optimal Transport in High Dimensions for Graph Distances,.. Custom transport-phenomena - gkx: Green-Kubo Method in JAX (πŸ₯‰4 Β· ⭐ 5 Β· πŸ’€) - Green-Kubo + JAX + MLPs = Anharmonic Thermal Conductivities Done Fast. MIT transport-phenomena - atom_by_atom (πŸ₯‰3 Β· ⭐ 9 Β· πŸ’€) - Atom-by-atom design of metal oxide catalysts for the oxygen evolution reaction with Machine Learning. Unlicensed surface-science single-paper - Element encoder (πŸ₯‰3 Β· ⭐ 6 Β· πŸ’€) - Autoencoder neural network to compress properties of atomic species into a vector representation. GPL-3.0 single-paper - Point Edge Transformer (πŸ₯‰2) - Smooth, exact rotational symmetrization for deep learning on point clouds. CC-BY-4.0 - SphericalNet (πŸ₯‰1 Β· ⭐ 3 Β· πŸ’€) - Implementation of Clebsch-Gordan Networks (CGnet: https://arxiv.org/pdf/1806.09231.pdf) by GElib & cnine libraries in.. Unlicensed


Universal Potentials

Back to top

Machine-learned interatomic potentials (ML-IAP) that have been trained on large, chemically and structurally diverse datasets. For materials, this means e.g. datasets that include a majority of the periodic table.

πŸ”— TeaNet - Universal neural network interatomic potential inspired by iterative electronic relaxations.. ML-IAP

πŸ”— PreFerred Potential (PFP) - Universal neural network potential for material discovery https://doi.org/10.1038/s41467-022-30687-9. ML-IAP proprietary

πŸ”— MatterSim - A Deep Learning Atomistic Model Across Elements, Temperatures and Pressures https://doi.org/10.48550/arXiv.2405.04967. ML-IAP active-learning proprietary

DPA-2 (πŸ₯‡27 Β· ⭐ 1.5K) - Towards a universal large atomic model for molecular and material simulation https://doi.org/10.48550/arXiv.2312.15492. LGPL-3.0 ML-IAP pretrained workflows datasets - [GitHub](https://github.com/deepmodeling/deepmd-kit) (πŸ‘¨β€πŸ’» 73 Β· πŸ”€ 520 Β· πŸ“₯ 46K Β· πŸ“¦ 22 Β· πŸ“‹ 870 - 10% open Β· ⏱️ 23.12.2024):
git clone https://github.com/deepmodeling/deepmd-kit
CHGNet (πŸ₯ˆ22 Β· ⭐ 260) - Pretrained universal neural network potential for charge-informed atomistic modeling https://chgnet.lbl.gov. Custom ML-IAP MD pretrained electrostatics magnetism structure-relaxation - [GitHub](https://github.com/CederGroupHub/chgnet) (πŸ‘¨β€πŸ’» 10 Β· πŸ”€ 68 Β· πŸ“¦ 43 Β· πŸ“‹ 62 - 4% open Β· ⏱️ 16.11.2024):
git clone https://github.com/CederGroupHub/chgnet
- [PyPi](https://pypi.org/project/chgnet) (πŸ“₯ 24K / month Β· πŸ“¦ 21 Β· ⏱️ 16.09.2024):
pip install chgnet
MACE-MP (πŸ₯ˆ18 Β· ⭐ 560) - Pretrained foundation models for materials chemistry. MIT ML-IAP pretrained rep-learn MD - [GitHub](https://github.com/ACEsuit/mace-mp) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 210 Β· πŸ“₯ 46K Β· πŸ“‹ 10 - 10% open Β· ⏱️ 15.11.2024):
git clone https://github.com/ACEsuit/mace-mp
- [PyPi](https://pypi.org/project/mace-torch) (πŸ“₯ 8.9K / month Β· πŸ“¦ 23 Β· ⏱️ 07.12.2024):
pip install mace-torch
M3GNet (πŸ₯ˆ18 Β· ⭐ 260) - Materials graph network with 3-body interactions featuring a DFT surrogate crystal relaxer and a state-of-the-art.. BSD-3 ML-IAP pretrained - [GitHub](https://github.com/materialsvirtuallab/m3gnet) (πŸ‘¨β€πŸ’» 16 Β· πŸ”€ 66 Β· πŸ“¦ 30 Β· πŸ“‹ 35 - 42% open Β· ⏱️ 04.10.2024):
git clone https://github.com/materialsvirtuallab/m3gnet
- [PyPi](https://pypi.org/project/m3gnet) (πŸ“₯ 880 / month Β· πŸ“¦ 5 Β· ⏱️ 17.11.2022):
pip install m3gnet
Orb Models (πŸ₯ˆ18 Β· ⭐ 220 Β· 🐣) - ORB forcefield models from Orbital Materials. Custom ML-IAP pretrained - [GitHub](https://github.com/orbital-materials/orb-models) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 23 Β· πŸ“¦ 6 Β· πŸ“‹ 19 - 10% open Β· ⏱️ 19.12.2024):
git clone https://github.com/orbital-materials/orb-models
- [PyPi](https://pypi.org/project/orb-models) (πŸ“₯ 1.9K / month Β· πŸ“¦ 4 Β· ⏱️ 20.12.2024):
pip install orb-models
SevenNet (πŸ₯‰17 Β· ⭐ 140) - SevenNet (Scalable EquiVariance Enabled Neural Network) is a graph neural network interatomic potential package that.. GPL-3.0 ML-IAP MD pretrained - [GitHub](https://github.com/MDIL-SNU/SevenNet) (πŸ‘¨β€πŸ’» 14 Β· πŸ”€ 21 Β· πŸ“¦ 8 Β· πŸ“‹ 33 - 30% open Β· ⏱️ 19.12.2024):
git clone https://github.com/MDIL-SNU/SevenNet
MLIP Arena Leaderboard (πŸ₯‰13 Β· ⭐ 53) - Fair and transparent benchmark of machine-learned interatomic potentials (MLIPs), beyond basic error metrics. Apache-2 ML-IAP community-resource - [GitHub](https://github.com/atomind-ai/mlip-arena) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 2 Β· πŸ“¦ 2 Β· πŸ“‹ 11 - 63% open Β· ⏱️ 25.12.2024):
git clone https://github.com/atomind-ai/mlip-arena
GRACE (πŸ₯‰10 Β· ⭐ 27 Β· 🐣) - GRACE models and gracemaker (as implemented in TensorPotential package). Custom ML-IAP pretrained MD rep-learn rep-eng - [GitHub](https://github.com/ICAMS/grace-tensorpotential) (πŸ‘¨β€πŸ’» 3 Β· πŸ”€ 3 Β· πŸ“¦ 1 Β· πŸ“‹ 2 - 50% open Β· ⏱️ 13.12.2024):
git clone https://github.com/ICAMS/grace-tensorpotential
Joint Multidomain Pre-Training (JMP) (πŸ₯‰5 Β· ⭐ 43) - Code for From Molecules to Materials Pre-training Large Generalizable Models for Atomic Property Prediction. CC-BY-NC-4.0 pretrained ML-IAP general-tool - [GitHub](https://github.com/facebookresearch/JMP) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 6 Β· πŸ“‹ 5 - 40% open Β· ⏱️ 22.10.2024):
git clone https://github.com/facebookresearch/JMP


Unsupervised Learning

Back to top

Projects that focus on unsupervised learning (USL) for atomistic ML, such as dimensionality reduction, clustering and visualization.

DADApy (πŸ₯‡19 Β· ⭐ 110) - Distance-based Analysis of DAta-manifolds in python. Apache-2 - [GitHub](https://github.com/sissa-data-science/DADApy) (πŸ‘¨β€πŸ’» 20 Β· πŸ”€ 18 Β· πŸ“¦ 12 Β· πŸ“‹ 37 - 27% open Β· ⏱️ 20.11.2024):
git clone https://github.com/sissa-data-science/DADApy
- [PyPi](https://pypi.org/project/dadapy) (πŸ“₯ 240 / month Β· ⏱️ 20.11.2024):
pip install dadapy
ASAP (πŸ₯ˆ11 Β· ⭐ 140 Β· πŸ’€) - ASAP is a package that can quickly analyze and visualize datasets of crystal or molecular structures. MIT - [GitHub](https://github.com/BingqingCheng/ASAP) (πŸ‘¨β€πŸ’» 6 Β· πŸ”€ 28 Β· πŸ“¦ 7 Β· πŸ“‹ 25 - 24% open Β· ⏱️ 27.06.2024):
git clone https://github.com/BingqingCheng/ASAP
Show 5 hidden projects...
- Sketchmap (πŸ₯ˆ8 Β· ⭐ 46 Β· πŸ’€) - Suite of programs to perform non-linear dimensionality reduction -- sketch-map in particular. GPL-3.0 C++
- Coarse-Graining-Auto-encoders (πŸ₯‰5 Β· ⭐ 21 Β· πŸ’€) - Implementation of coarse-graining Autoencoders. Unlicensed single-paper
- paper-ml-robustness-material-property (πŸ₯‰5 Β· ⭐ 4 Β· πŸ’€) - A critical examination of robustness and generalizability of machine learning prediction of materials properties. BSD-3 datasets single-paper
- KmdPlus (πŸ₯‰4 Β· ⭐ 4) - This module contains a class for treating kernel mean descriptor (KMD), and a function for generating descriptors with.. MIT
- Descriptor Embedding and Clustering for Atomisitic-environment Framework (DECAF) ( ⭐ 2) - Provides a workflow to obtain clustering of local environments in dataset of structures. Unlicensed


Visualization

Back to top

Projects that focus on visualization (viz.) for atomistic ML.

Crystal Toolkit (πŸ₯‡24 Β· ⭐ 160) - Crystal Toolkit is a framework for building web apps for materials science and is currently powering the new Materials.. MIT - [GitHub](https://github.com/materialsproject/crystaltoolkit) (πŸ‘¨β€πŸ’» 31 Β· πŸ”€ 57 Β· πŸ“¦ 41 Β· πŸ“‹ 110 - 47% open Β· ⏱️ 02.01.2025):
git clone https://github.com/materialsproject/crystaltoolkit
- [PyPi](https://pypi.org/project/crystal-toolkit) (πŸ“₯ 2.8K / month Β· πŸ“¦ 10 Β· ⏱️ 22.10.2024):
pip install crystal-toolkit
pymatviz (πŸ₯ˆ22 Β· ⭐ 180) - A toolkit for visualizations in materials informatics. MIT general-tool probabilistic - [GitHub](https://github.com/janosh/pymatviz) (πŸ‘¨β€πŸ’» 9 Β· πŸ”€ 16 Β· πŸ“¦ 17 Β· πŸ“‹ 54 - 22% open Β· ⏱️ 31.12.2024):
git clone https://github.com/janosh/pymatviz
- [PyPi](https://pypi.org/project/pymatviz) (πŸ“₯ 6.7K / month Β· πŸ“¦ 6 Β· ⏱️ 20.12.2024):
pip install pymatviz
ZnDraw (πŸ₯ˆ21 Β· ⭐ 38) - A powerful tool for visualizing, modifying, and analysing atomistic systems. EPL-2.0 MD generative JavaScript - [GitHub](https://github.com/zincware/ZnDraw) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 4 Β· πŸ“¦ 10 Β· πŸ“‹ 360 - 27% open Β· ⏱️ 13.12.2024):
git clone https://github.com/zincware/ZnDraw
- [PyPi](https://pypi.org/project/zndraw) (πŸ“₯ 1.9K / month Β· πŸ“¦ 4 Β· ⏱️ 13.12.2024):
pip install zndraw
Chemiscope (πŸ₯‰19 Β· ⭐ 140) - An interactive structure/property explorer for materials and molecules. BSD-3 JavaScript - [GitHub](https://github.com/lab-cosmo/chemiscope) (πŸ‘¨β€πŸ’» 24 Β· πŸ”€ 34 Β· πŸ“₯ 400 Β· πŸ“¦ 6 Β· πŸ“‹ 140 - 28% open Β· ⏱️ 14.11.2024):
git clone https://github.com/lab-cosmo/chemiscope
- [npm](https://www.npmjs.com/package/chemiscope) (πŸ“₯ 27 / month Β· πŸ“¦ 3 Β· ⏱️ 15.03.2023):
npm install chemiscope
Elementari (πŸ₯‰12 Β· ⭐ 140) - Interactive browser visualizations for materials science: periodic tables, 3d crystal structures, Bohr atoms, nuclei,.. MIT JavaScript - [GitHub](https://github.com/janosh/elementari) (πŸ‘¨β€πŸ’» 2 Β· πŸ”€ 13 Β· πŸ“¦ 3 Β· πŸ“‹ 7 - 28% open Β· ⏱️ 07.10.2024):
git clone https://github.com/janosh/elementari
- [npm](https://www.npmjs.com/package/elementari) (πŸ“₯ 170 / month Β· πŸ“¦ 1 Β· ⏱️ 15.01.2024):
npm install elementari
Show 1 hidden projects...
- Atomvision (πŸ₯‰12 Β· ⭐ 30 Β· πŸ’€) - Deep learning framework for atomistic image data. Custom computer-vision experimental-data rep-learn


Wavefunction methods (ML-WFT)

Back to top

Projects and models that focus on quantities of wavefunction theory methods, such as Monte Carlo techniques like deep learning variational Monte Carlo (DL-VMC), quantum chemistry methods, etc.

DeepQMC (πŸ₯‡20 Β· ⭐ 360 Β· πŸ“‰) - Deep learning quantum Monte Carlo for electrons in real space. MIT - [GitHub](https://github.com/deepqmc/deepqmc) (πŸ‘¨β€πŸ’» 13 Β· πŸ”€ 62 Β· πŸ“¦ 3 Β· πŸ“‹ 51 - 5% open Β· ⏱️ 23.10.2024):
git clone https://github.com/deepqmc/deepqmc
- [PyPi](https://pypi.org/project/deepqmc) (πŸ“₯ 450 / month Β· ⏱️ 24.09.2024):
pip install deepqmc
FermiNet (πŸ₯ˆ13 Β· ⭐ 750) - An implementation of the Fermionic Neural Network for ab-initio electronic structure calculations. Apache-2 transformer - [GitHub](https://github.com/google-deepmind/ferminet) (πŸ‘¨β€πŸ’» 18 Β· πŸ”€ 130 Β· πŸ“‹ 57 - 1% open Β· ⏱️ 08.12.2024):
git clone https://github.com/google-deepmind/ferminet
DeepErwin (πŸ₯‰10 Β· ⭐ 54) - DeepErwin is a python 3.8+ package that implements and optimizes JAX 2.x wave function models for numerical solutions.. Custom - [GitHub](https://github.com/mdsunivie/deeperwin) (πŸ‘¨β€πŸ’» 7 Β· πŸ”€ 8 Β· πŸ“₯ 13 Β· πŸ“¦ 2 Β· ⏱️ 19.12.2024):
git clone https://github.com/mdsunivie/deeperwin
- [PyPi](https://pypi.org/project/deeperwin) (πŸ“₯ 190 / month Β· ⏱️ 14.12.2021):
pip install deeperwin
Show 2 hidden projects...
- ACEpsi.jl (πŸ₯‰6 Β· ⭐ 2 Β· πŸ’€) - ACE wave function parameterizations. MIT rep-eng Julia
- SchNOrb (πŸ₯‰5 Β· ⭐ 61 Β· πŸ’€) - Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions. MIT


Others

Back to top

Show 1 hidden projects...

Contribution

Contributions are encouraged and always welcome! If you would like to add or update projects, choose one of the following ways:

  • Open an issue by selecting one of the provided categories from the issue page and fill in the requested information.
  • Modify the projects.yaml with your additions or changes, and submit a pull request. This can also be done directly via the Github UI.

If you would like to contribute to or share suggestions regarding the project metadata collection or markdown generation, please refer to the best-of-generator repository. If you would like to create your own best-of list, we recommend following this guide.

For more information on how to add or update projects, please read the contribution guidelines. By participating in this project, you agree to abide by its Code of Conduct.

License

CC0

BibTeX Generator

Have you ever grown weary of the tedious task of manually creating BibTeX entries for your papers?

There are, indeed, support tools and plugins that are bundled with reference managers such as Zotero, Mendeley, etc. These tools can automate the generation of a .bib file. To use them, you need to install a reference manager, its associated plugins, and a library of papers on your computer. However, these tools are not flawless. The BibTeX entries they generate often contain incomplete information, are poorly formatted, and include numerous unnecessary fields. You then still need to manually check and correct the entries.

Then there are the times you just need to cite a paper or two and don't want to go through the hassle of the aforementioned complex process. In such situations, a simple tool that lets you quickly copy and paste a BibTeX entry into your .bib file would be ideal. With such a tool in mind, I looked around the Chrome Web Store to see if any extension could pick up the BibTeX while you are browsing a paper. I found some, but they do not really work.

Therefore, I decided to create my own tool to address this dilemma: a Chrome extension that generates the BibTeX entry for the current URL with just one click. I named it 1click BibTeX. It delivers exactly what is expected of it and has proven to be quite helpful. This extension, along with LaTeX tools, ensures that a manuscript's citations are properly formatted before they are delivered to the journal.
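For context, most publisher pages embed Highwire Press-style `citation_*` meta tags that a browser extension can read to build an entry. The sketch below is not the extension's actual code; it is a minimal Python illustration of the idea, with made-up sample HTML:

```python
from html.parser import HTMLParser

class CitationMeta(HTMLParser):
    """Collect Highwire-style citation_* meta tags from an HTML page."""
    def __init__(self):
        super().__init__()
        self.meta = {}
        self.authors = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name, content = a.get("name", ""), a.get("content", "")
        if name == "citation_author":
            self.authors.append(content)       # may repeat once per author
        elif name.startswith("citation_"):
            self.meta[name[len("citation_"):]] = content

# Sample metadata as a publisher page might expose it (values are made up
# to match the first example entry below)
html = """<head>
<meta name="citation_title" content="Pattern transformation induced by elastic instability of metallic porous structures">
<meta name="citation_author" content="Cao Thang Nguyen">
<meta name="citation_doi" content="10.1016/j.commatsci.2018.10.023">
<meta name="citation_publication_date" content="2019/02/01">
</head>"""

p = CitationMeta()
p.feed(html)
year = p.meta["publication_date"].split("/")[0]
key = p.authors[0].split()[-1].lower() + year   # e.g. nguyen2019
entry = "@article{%s,\n  title = {%s},\n  author = {%s},\n  year = {%s},\n  doi = {%s}\n}" % (
    key, p.meta["title"], " and ".join(p.authors), year, p.meta["doi"])
print(entry)
```

Real pages vary in which tags they expose, which is why a robust extension also needs per-publisher fallbacks.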

Usage

Install the 1click BibTeX extension in your Chrome browser. Then, whenever you are browsing a paper or any URL, just click the extension icon, and the BibTeX entry will be instantly generated and copied to your clipboard. All that remains is to paste it into your .bib file.

BibTeX generator

I've tested the extension on numerous publishers and websites with varying structures, and it works consistently as designed. Tested publishers include Elsevier, Wiley, ACS, IOP, AIP, APS, and arXiv, among others.

Below are some examples of BibTeX entries generated by the extension 1click BibTeX:

@article{nguyen2019pattern,
    title = {Pattern transformation induced by elastic instability of metallic porous structures},
    author = {Cao Thang Nguyen and Duc Tam Ho and Seung Tae Choi and Doo-Man Chun and Sung Youb Kim },
    year = {2019},
    month = {2},
    journal = {Computational Materials Science},
    publisher = {Elsevier},
    volume = {157},
    pages = {17-24},
    doi = {10.1016/j.commatsci.2018.10.023},
    url = {https://www.sciencedirect.com/science/article/abs/pii/S0927025618306955?via%3Dihub},
    accessDate = {Jan 25, 2024}
}
@article{nguyen2024an,
    title = {An Enhanced Sampling Approach for Computing the Free Energy of Solid Surface and Solid–Liquid Interface},
    author = {Cao Thang Nguyen and Duc Tam Ho and Sung Youb Kim},
    year = {2024},
    month = {1},
    journal = {Advanced Theory and Simulations},
    publisher = {John Wiley & Sons, Ltd},
    volume = {7},
    number = {1},
    pages = {2300538},
    doi = {10.1002/adts.202300538},
    url = {https://onlinelibrary.wiley.com/doi/10.1002/adts.202300538},
    accessDate = {Jan 25, 2024}
}
@book{daum2003america,
    title = {America, the Vietnam War, and the World},
    author = {Andreas W. Daum and Lloyd C. Gardner and Wilfried Mausbach},
    year = {2003},
    month = {7},
    publisher = {Cambridge University Press},
    isbn = {052100876X},
    url = {https://www.google.co.kr/books/edition/America_the_Vietnam_War_and_the_World/9kn6qYwsGs4C?hl=en&gbpv=0},
    accessDate = {Jan 25, 2024}
}
@book{rickards2011currency,
    title = {Currency Wars},
    author = {James Rickards},
    year = {2011},
    month = {11},
    publisher = {Penguin},
    isbn = {110155889X},
    url = {https://books.google.co.kr/books?id=-GDwL2s5sJoC&source=gbs_book_other_versions},
    accessDate = {Jan 25, 2024}
}
@misc{deci2024introducing,
    title = {Introducing DeciCoder-6B: The Best Multi-Language Code LLM in Its Class},
    author = {Deci},
    year = {2024},
    month = {1},
    publisher = {Deci},
    url = {https://deci.ai/blog/decicoder-6b-the-best-multi-language-code-generation-llm-in-its-class/},
    accessDate = {Jan 25, 2024}
}
@misc{kai2023forcefield,
    title = {Force-field files for "Noble gas (He, Ne and Ar) solubilities in high-pressure silicate melts calculated based on deep potential modeling"},
    author = {Wang, Kai and Lu, Xiancai and Liu, Xiandong and Yin, Kun},
    year = {2023},
    month = {3},
    publisher = {Zenodo},
    doi = {10.5281/zenodo.7751762},
    url = {https://zenodo.org/records/7751762},
    accessDate = {Jan 25, 2024}
}
  • Bibtex this page
@misc{nguyen2024bibtex,
    title = {BibTeX Generator},
    author = {Cao Thang Nguyen},
    year = {2024},
    month = {1},
    url = {https://thang.eu.org/blog/2024/01/25/bibtex_generator},
    accessDate = {Jan 25, 2024}
}

In summary, the new extension 1click BibTeX works well for most websites with varying data structures.

Accelerated Molecular Simulation Using Deep Potential Workflow with NGC

Credit: NVIDIA's blog

Molecular simulation communities have faced the accuracy-versus-efficiency dilemma in modeling the potential energy surface and interatomic forces for decades. Deep Potential, the artificial neural network force field, solves this problem by combining the speed of classical molecular dynamics (MD) simulation with the accuracy of density functional theory (DFT) calculation.1 This is achieved by using the GPU-optimized package DeePMD-kit, which is a deep learning package for many-body potential energy representation and MD simulation.2

This post provides an end-to-end demonstration of training a neural network potential for the 2D material graphene and using it to drive MD simulation in the open-source platform Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS).3 Training data can be obtained either from the Vienna Ab initio Simulation Package (VASP)4, or Quantum ESPRESSO (QE).5

A seamless integration of molecular modeling, machine learning, and high-performance computing (HPC) is demonstrated, combining the efficiency of molecular dynamics with ab initio accuracy in a workflow that is driven entirely through containers. By using AI techniques to fit the interatomic forces generated by DFT, the accessible time and size scales can be boosted several orders of magnitude with linear scaling.

Deep potential is essentially a combination of machine learning and physical principles, which together start a new computing paradigm, as shown in Figure 1.



Figure 1. A new computing paradigm composed of molecular modeling, AI, and HPC. (Figure courtesy: Dr. Linfeng Zhang, DP Technology)

The entire workflow is shown in Figure 2. The data generation step is done with VASP and QE. The data preparation, model training, testing, and compression steps are done using DeePMD-kit. The model deployment is in LAMMPS.



Figure 2. Diagram of the DeePMD workflow.

Why Containers?

A container is a portable unit of software that combines the application and all its dependencies into a single package that is agnostic to the underlying host OS.

The workflow in this post involves AIMD, DP training, and LAMMPS MD simulation. It is nontrivial and time-consuming to install each software package from source with the correct setup of the compiler, MPI, GPU library, and optimization flags.

Containers solve this problem by providing a highly optimized, GPU-enabled computing environment for each step, and they eliminate the time needed to install and test software.

The NGC catalog, a hub of GPU-optimized HPC and AI software, carries a wide range of HPC and AI containers that can be readily deployed on any GPU system. The HPC and AI containers from the NGC catalog are updated frequently and are tested for reliability and performance, which is necessary to speed up the time to solution.

These containers are also scanned for Common Vulnerabilities and Exposures (CVEs), ensuring that they are devoid of open ports and malware. Additionally, the HPC containers support both Docker and Singularity runtimes, and can be deployed on multi-GPU and multinode systems running in the cloud or on-premises.

Training data generation

The first step in the simulation is data generation. We will show you how you can use VASP and Quantum ESPRESSO to run AIMD simulations and generate training datasets for DeePMD. All input files can be downloaded from the GitHub repository using the following command:

git clone https://github.com/deepmodeling/SC21_DP_Tutorial.git

VASP

A two-dimensional graphene system with 98 atoms is used, as shown in Figure 3.6 To generate the training datasets, a 0.5 ps NVT AIMD simulation at 300 K is performed with a time step of 0.5 fs. The DP model is created using the 1,000 time steps of this 0.5 ps MD trajectory at a fixed temperature.

Due to the short simulation time, the training dataset contains consecutive system snapshots, which are highly correlated. In general, the training dataset should be sampled from uncorrelated snapshots spanning a variety of system conditions and configurations. For this example, we used a simplified training data scheme. For production DP training, DP-GEN is recommended: its concurrent learning scheme efficiently explores more combinations of conditions.7

Projector augmented-wave pseudopotentials are employed to describe the interactions between the valence electrons and frozen cores, together with the Perdew-Burke-Ernzerhof generalized gradient approximation exchange-correlation functional. Only the Γ-point was used for k-space sampling in all systems.



Figure 3. A graphene system composed of 98 carbon atoms is used in AIMD simulation.

Quantum Espresso

The AIMD simulation can also be carried out using Quantum ESPRESSO, available as a container from the NGC Catalog. Quantum ESPRESSO is an integrated suite of open-source computer codes for electronic-structure calculations and materials modeling at the nanoscale based on density-functional theory, plane waves, and pseudopotentials. The same graphene structure is used in the QE calculations. The following command can be used to start the AIMD simulation:

$ singularity exec --nv docker://nvcr.io/hpc/quantum_espresso:qe-6.8 cp.x < c.md98.cp.in

Training data preparation

Once the training data is obtained from AIMD simulation, we want to convert its format using dpdata so that it can be used as input to the deep neural network. The dpdata package is a format conversion toolkit between AIMD, classical MD, and DeePMD-kit.

You can use the convenient tool dpdata to convert data directly from the output of first-principles packages to the DeePMD-kit format. For deep potential training, the following information about a physical system has to be provided: atom types, box boundaries, coordinates, forces, virial, and system energy.

A snapshot, or a frame of the system, contains all of these data points for all atoms at one time step, and can be stored in two formats: raw and npy.

The first format, raw, is plain text with all information in one file, and each line of the file represents a snapshot. Different system information is stored in different files named box.raw, coord.raw, force.raw, energy.raw, and virial.raw. We recommend you follow these naming conventions when preparing the training files.

An example of force.raw:

$ cat force.raw
-0.724  2.039 -0.951  0.841 -0.464  0.363
 6.737  1.554 -5.587 -2.803  0.062  2.222
-1.968 -0.163  1.020 -0.225 -0.789  0.343

This force.raw contains three frames, each holding the forces on two atoms, resulting in three lines and six columns. Each line provides all three force components for the two atoms in one frame: the first three numbers are the force components of the first atom, and the next three are those of the second atom.

The coordinate file coord.raw is organized similarly. In box.raw, the nine components of the box vectors should be provided on each line. In virial.raw, the nine components of the virial tensor should be provided on each line in the order XX XY XZ YX YY YZ ZX ZY ZZ. The number of lines in all raw files should be identical. The atom types are assumed not to change across frames; they are provided by type.raw, which has a single line listing the types of all atoms one by one.
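To make the raw layout concrete, here is a small numpy sketch (not part of the tutorial repository) that loads a force.raw like the sample above and reshapes it into per-atom force vectors:

```python
import numpy as np

# Write the force.raw sample shown above: 3 frames of a 2-atom system
with open("force.raw", "w") as f:
    f.write("-0.724  2.039 -0.951  0.841 -0.464  0.363\n"
            " 6.737  1.554 -5.587 -2.803  0.062  2.222\n"
            "-1.968 -0.163  1.020 -0.225 -0.789  0.343\n")

forces = np.loadtxt("force.raw")             # shape: (nframes, natoms * 3)
nframes, ncols = forces.shape
natoms = ncols // 3
forces = forces.reshape(nframes, natoms, 3)  # one (fx, fy, fz) vector per atom
print(forces.shape)    # (3, 2, 3)
print(forces[0, 1])    # forces on the second atom in the first frame
```

The same reshape applies to coord.raw; box.raw and virial.raw reshape to (nframes, 3, 3) instead.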

The atom types should be integers. For example, the type.raw of a system that has two atoms of types zero and one:

$ cat type.raw
0 1

Converting the data to the raw format is not required, but the process above should give a sense of the types of data that can be used as input to DeePMD-kit for training.

The easiest way to convert the first-principles results to the training data is to save them as numpy binary data.

For VASP output, we have prepared an outcartodata.py script to process the VASP OUTCAR file. Run the following commands:

$ cd SC21_DP_Tutorial/AIMD/VASP/
$ singularity exec --nv docker://nvcr.io/hpc/deepmd-kit:v2.0.3 python outcartodata.py
$ mv deepmd_data ../../DP/

For QE output:

$ cd SC21_DP_Tutorial/AIMD/QE/
$ singularity exec --nv docker://nvcr.io/hpc/deepmd-kit:v2.0.3 python logtodata.py
$ mv deepmd_data ../../DP/

A folder called deepmd_data is generated and moved to the training directory. It contains five sets, 0/set.000, 1/set.000, 2/set.000, 3/set.000, 4/set.000, each holding 200 frames. You do not need to handle the binary data files inside the set.* directories yourself. The path containing the set.* folder and the type.raw file is called a system. If you want to train on a nonperiodic system, place an empty nopbc file under the system directory; box.raw is then not necessary, as the system is nonperiodic.
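For reference, the npy system layout described above can be mimicked by hand with numpy. The sketch below (sizes and random values are placeholders; the file names follow the DeePMD-kit conventions) builds one set for a two-atom system:

```python
import os
import numpy as np

natoms, nframes = 2, 200
os.makedirs("deepmd_data/set.000", exist_ok=True)

# type.raw: one integer atom type per atom, shared by all frames
with open("deepmd_data/type.raw", "w") as f:
    f.write("0 0\n")

# Per-set numpy binaries; coordinates and forces are flattened to natoms * 3
np.save("deepmd_data/set.000/coord.npy", np.random.rand(nframes, natoms * 3))
np.save("deepmd_data/set.000/force.npy", np.random.rand(nframes, natoms * 3))
np.save("deepmd_data/set.000/energy.npy", np.random.rand(nframes))
np.save("deepmd_data/set.000/box.npy", np.tile(10.0 * np.eye(3).ravel(), (nframes, 1)))

print(sorted(os.listdir("deepmd_data/set.000")))
```

In practice dpdata writes these files for you; the sketch only shows what ends up on disk.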

We are going to use three of the five sets for training, one for validation, and the remaining one for testing.

Deep Potential model training

The input of the deep potential model is a descriptor vector containing the system information mentioned previously. The neural network consists of several hidden layers composed of linear and nonlinear transformations. In this post, a three-layer neural network with 25, 50, and 100 neurons in the respective layers is used. The target value, or label, for the neural network to learn is the atomic energies. The training process optimizes the weights and bias vectors by minimizing the loss function.

The training is initiated by the following command, where input.json contains the training parameters:

$ singularity exec --nv docker://nvcr.io/hpc/deepmd-kit:v2.0.3 dp train input.json

DeePMD-kit prints detailed information on the training and validation datasets. The datasets are determined by training_data and validation_data as defined in the training section of the input script. The training dataset is composed of three data systems, while the validation dataset is composed of one data system. The number of atoms, batch size, number of batches in the system, and the probability of using the system are all shown in Figure 4. The last column indicates whether periodic boundary conditions are assumed for the system.



Figure 4. Screenshot of the DP training output.

During training, the error of the model is tested every disp_freq training steps, both on the batch used to train the model and on numb_btch batches from the validation data. The training and validation errors are printed to the file disp_file (default: lcurve.out). The batch size can be set in the input script with the key batch_size in the corresponding sections for the training and validation datasets.
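For orientation, the relevant pieces of input.json might look like the following. This is a minimal sketch, not the tutorial's actual file: the [25, 50, 100] layer widths come from the text, while the descriptor type, cutoff, fitting-net widths, and step counts are placeholder assumptions.

```json
{
    "model": {
        "type_map": ["C"],
        "descriptor": { "type": "se_e2_a", "rcut": 6.0, "neuron": [25, 50, 100] },
        "fitting_net": { "neuron": [240, 240, 240] }
    },
    "training": {
        "training_data":   { "systems": ["deepmd_data/0", "deepmd_data/1", "deepmd_data/2"], "batch_size": "auto" },
        "validation_data": { "systems": ["deepmd_data/3"], "batch_size": "auto", "numb_btch": 1 },
        "numb_steps": 1000000,
        "disp_freq": 100,
        "disp_file": "lcurve.out"
    }
}
```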

An example of the output:

#  step      rmse_val    rmse_trn    rmse_e_val  rmse_e_trn    rmse_f_val  rmse_f_trn         lr
      0      3.33e+01    3.41e+01      1.03e+01    1.03e+01      8.39e-01    8.72e-01    1.0e-03
    100      2.57e+01    2.56e+01      1.87e+00    1.88e+00      8.03e-01    8.02e-01    1.0e-03
    200      2.45e+01    2.56e+01      2.26e-01    2.21e-01      7.73e-01    8.10e-01    1.0e-03
    300      1.62e+01    1.66e+01      5.01e-02    4.46e-02      5.11e-01    5.26e-01    1.0e-03
    400      1.36e+01    1.32e+01      1.07e-02    2.07e-03      4.29e-01    4.19e-01    1.0e-03
    500      1.07e+01    1.05e+01      2.45e-03    4.11e-03      3.38e-01    3.31e-01    1.0e-03

The training error reduces monotonically with training steps as shown in Figure 5. The trained model is tested on the test dataset and compared with the AIMD simulation results. The test command is:

$ singularity exec --nv docker://nvcr.io/hpc/deepmd-kit:v2.0.3 dp test -m frozen_model.pb -s deepmd_data/4/ -n 200 -d detail.out



Figure 5. Training loss with steps

The results are shown in Figure 6.



Figure 6. Test of the prediction accuracy of trained DP model with AIMD energies and forces.

Model export and compression

After the model has been trained, a frozen model is generated for inference in MD simulation. The process of saving a neural network from a checkpoint is called "freezing" a model:

$ singularity exec --nv docker://nvcr.io/hpc/deepmd-kit:v2.0.3 dp freeze -o graphene.pb

After the frozen model is generated, it can be compressed without sacrificing accuracy while greatly speeding up inference performance in MD. Depending on the simulation and training setup, model compression can boost performance by 10X and reduce memory consumption by 20X when running on GPUs.

The frozen model can be compressed using the following command where -i refers to the frozen model and -o points to the output name of the compressed model:

$ singularity exec --nv docker://nvcr.io/hpc/deepmd-kit:v2.0.3 dp compress -i graphene.pb -o graphene-compress.pb

Model deployment in LAMMPS

A new pair style has been implemented in LAMMPS to deploy the neural network trained in the prior steps. For users familiar with the LAMMPS workflow, only minimal changes are needed to switch to deep potential. For instance, a traditional LAMMPS input with the Tersoff potential has the following setting for the potential setup:

pair_style      tersoff
pair_coeff      * * BNC.tersoff C

To use deep potential, replace previous lines with:

pair_style      deepmd graphene-compress.pb
pair_coeff      * *

The pair_style command in the input file uses the DeePMD model to describe the atomic interactions in the graphene system.

The graphene-compress.pb file represents the frozen and compressed model for inference. The graphene system in the MD simulation contains 1,560 atoms. Periodic boundary conditions are applied in the lateral x- and y-directions, and a free boundary is applied in the z-direction. The time step is set to 1 fs. The system is relaxed in the NVT ensemble at a temperature of 300 K, consistent with the AIMD setup. The system configuration after NVT relaxation is shown in Figure 7. It can be observed that the deep potential describes the atomic structure with small ripples in the cross-plane direction. After 10 ps of NVT relaxation, the system is placed in the NVE ensemble to check system stability.
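A minimal LAMMPS input consistent with this setup might look as follows. This is a sketch, not the tutorial's actual script; the data file name and the thermostat damping constant are assumptions.

```
units           metal
boundary        p p f          # periodic in x and y, free in z
atom_style      atomic
read_data       graphene.data  # assumed data file holding the 1,560-atom sheet

pair_style      deepmd graphene-compress.pb
pair_coeff      * *

timestep        0.001          # metal units: 0.001 ps = 1 fs
fix             1 all nvt temp 300.0 300.0 0.1
run             10000          # 10 ps NVT relaxation
unfix           1
fix             2 all nve
run             10000          # 10 ps NVE stability check
```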



Figure 7. Atomic configuration of the graphene system after relaxation with deep potential.

The system temperature is shown in Figure 8.



Figure 8. System temperature under NVT and NVE ensembles. The MD system driven by deep potential is very stable after relaxation.

To validate the accuracy of the trained DP model, the radial distribution functions (RDFs) calculated with AIMD, DP, and Tersoff are plotted in Figure 9. The DP-generated RDF is very close to that of AIMD, which indicates that the crystalline structure of graphene is well reproduced by the DP model.
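The RDF itself is straightforward to compute from a trajectory snapshot. A minimal numpy sketch for a single frame in a cubic periodic box (not the script used for Figure 9; the sanity check uses random ideal-gas positions, for which g(r) should hover around 1):

```python
import numpy as np

def rdf(positions, box, r_max, nbins):
    """Radial distribution function g(r) for one frame in a cubic box of side `box`."""
    n = len(positions)
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)                     # minimum-image convention
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, k=1)]  # unique pair distances
    hist, edges = np.histogram(r, bins=nbins, range=(0.0, r_max))
    shell = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal = shell * n / box ** 3 * n / 2             # expected pair counts, ideal gas
    return 0.5 * (edges[1:] + edges[:-1]), hist / ideal

# Sanity check on an ideal gas: g(r) stays near 1 at all distances
rng = np.random.default_rng(0)
r, g = rdf(rng.uniform(0.0, 10.0, (500, 3)), box=10.0, r_max=4.0, nbins=20)
print(g.mean())
```

For a real trajectory one would average g(r) over many frames and, for graphene, see sharp peaks at the C-C neighbor shells instead of a flat curve.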



Figure 9. Radial distribution function calculated by AIMD, DP and Tersoff potential, respectively. It can be observed that the RDF calculated by DP is very close to that of AIMD.
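An RDF like the one in Figure 9 can be accumulated directly during the MD run with LAMMPS's built-in `compute rdf` command. The fragment below is a sketch, not the input used for Figure 9; the bin count, averaging windows, and output file name are assumptions.

```
# Sketch: accumulate a time-averaged RDF during the NVE run (assumed settings)
compute         myrdf all rdf 100     # histogram with 100 bins out to the pair cutoff
fix             rdfout all ave/time 100 10 1000 c_myrdf[*] file graphene.rdf mode vector
run             10000
```

The resulting graphene.rdf file can then be plotted alongside the AIMD and Tersoff RDFs for comparison.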

Conclusion

This post demonstrates a simple case study of graphene under the given conditions. The DeePMD-kit package streamlines the workflow from AIMD to classical MD with a deep potential, providing the following key advantages:

- Highly automatic and efficient workflow implemented in the TensorFlow framework.
- APIs to popular DFT and MD packages such as VASP, QE, and LAMMPS.
- Broad applications to organic molecules, metals, semiconductors, insulators, and more.
- Highly efficient code for HPC with MPI and GPU support.
- Modularization for easy adoption by other deep learning potential models.

Furthermore, the use of GPU-optimized containers from the NGC catalog simplifies and accelerates the overall workflow by eliminating the steps to install and configure the software. To train a comprehensive model for other applications, download the DeepMD Kit Container from the NGC catalog.

References

[1] Jia W, Wang H, Chen M, Lu D, Lin L, Car R, E W and Zhang L 2020 Pushing the limit of molecular dynamics with ab initio accuracy to 100 million atoms with machine learning IEEE Press 5 1-14

[2] Wang H, Zhang L, Han J and E W 2018 DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics Computer Physics Communications 228 178-84

[3] Plimpton S 1995 Fast Parallel Algorithms for Short-Range Molecular Dynamics Journal of Computational Physics 117 1-19

[4] Kresse G and Hafner J 1993 Ab initio molecular dynamics for liquid metals Physical Review B 47 558-61

[5] Giannozzi P, Baroni S, Bonini N, Calandra M, Car R, Cavazzoni C, Ceresoli D, Chiarotti G L, Cococcioni M, Dabo I, Dal Corso A, de Gironcoli S, Fabris S, Fratesi G, Gebauer R, Gerstmann U, Gougoussis C, Kokalj A, Lazzeri M, Martin-Samos L, Marzari N, Mauri F, Mazzarello R, Paolini S, Pasquarello A, Paulatto L, Sbraccia C, Scandolo S, Sclauzero G, Seitsonen A P, Smogunov A, Umari P and Wentzcovitch R M 2009 QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials Journal of Physics: Condensed Matter 21 395502

[6] Humphrey W, Dalke A and Schulten K 1996 VMD: Visual molecular dynamics Journal of Molecular Graphics 14 33-8

[7] Zhang Y, Wang H, Chen W, Zeng J, Zhang L, Wang H and E W 2020 DP-GEN: A concurrent learning platform for the generation of reliable deep learning based potential energy models Computer Physics Communications 107206