Installation¶
Portable installation¶
To reuse modules and utilities from MULTICOM_ligand
in other projects, one can simply install the package with pip:
pip install multicom_ligand
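As a quick sanity check, one can then confirm the package is importable (this assumes the distribution installs a top-level `multicom_ligand` module; adjust the import if the package layout differs):
python3 -c "import multicom_ligand"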
Full installation¶
To reproduce, customize, or extend the MULTICOM_ligand
benchmark, we recommend fully installing MULTICOM_ligand
using mamba
as follows:
First, install mamba
for dependency management (as a fast alternative to Anaconda)
wget "https://github.com/conda-forge/miniforge/releases/download/24.11.3-0/Miniforge3-$(uname)-$(uname -m).sh"
bash Miniforge3-$(uname)-$(uname -m).sh # accept all terms and install to the default location
rm Miniforge3-$(uname)-$(uname -m).sh # (optionally) remove installer after using it
source ~/.bashrc # alternatively, one can restart their shell session to achieve the same result
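To verify that mamba is now available in the shell session, one can run:
mamba --version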
Install dependencies for each method’s environment (as desired)
# clone project
sudo apt-get install git-lfs # NOTE: run this if you have not already installed `git-lfs`
git lfs install
git clone https://github.com/BioinfoMachineLearning/MULTICOM_ligand --recursive
cd MULTICOM_ligand
# create conda environments (~80 GB total)
# - MULTICOM_ligand environment (~15 GB)
mamba env create -f environments/multicom_ligand_environment.yaml
conda activate MULTICOM_ligand # NOTE: one still needs to use `conda` to (de)activate environments
pip3 install -e .
pip3 install numpy==1.26.4 --no-dependencies
pip3 install prody==2.4.1 --no-dependencies
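# (optional) sanity-check that the pinned NumPy and ProDy versions resolved correctly in the active environment
python3 -c "import numpy, prody; print(numpy.__version__, prody.__version__)"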
# - casp15_ligand_scoring environment (~3 GB)
mamba env create -f environments/casp15_ligand_scoring_environment.yaml
conda activate casp15_ligand_scoring # NOTE: one still needs to use `conda` to (de)activate environments
pip3 install -e .
# - DiffDock environment (~13 GB)
mamba env create -f environments/diffdock_environment.yaml --prefix forks/DiffDock/DiffDock/
conda activate forks/DiffDock/DiffDock/ && pip3 install pyg-lib -f https://data.pyg.org/whl/torch-2.1.0+cu118.html # NOTE: one still needs to use `conda` to (de)activate environments
# - FABind environment (~6 GB)
mamba env create -f environments/fabind_environment.yaml --prefix forks/FABind/FABind/
conda activate forks/FABind/FABind/ # NOTE: one still needs to use `conda` to (de)activate environments
# - DynamicBind environment (~13 GB)
mamba env create -f environments/dynamicbind_environment.yaml --prefix forks/DynamicBind/DynamicBind/
conda activate forks/DynamicBind/DynamicBind/ && pip3 install pyg-lib -f https://data.pyg.org/whl/torch-2.1.0+cu118.html # NOTE: one still needs to use `conda` to (de)activate environments
# - NeuralPLexer environment (~14 GB)
mamba env create -f environments/neuralplexer_environment.yaml --prefix forks/NeuralPLexer/NeuralPLexer/
conda activate forks/NeuralPLexer/NeuralPLexer/ # NOTE: one still needs to use `conda` to (de)activate environments
cd forks/NeuralPLexer/ && pip3 install -e . && cd ../../
# - RoseTTAFold-All-Atom environment (~14 GB) - NOTE: after running these commands, follow the installation instructions in `forks/RoseTTAFold-All-Atom/README.md` starting at Step 4 (with `forks/RoseTTAFold-All-Atom/` as the current working directory)
mamba env create -f environments/rfaa_environment.yaml --prefix forks/RoseTTAFold-All-Atom/RFAA/
conda activate forks/RoseTTAFold-All-Atom/RFAA/ # NOTE: one still needs to use `conda` to (de)activate environments
cd forks/RoseTTAFold-All-Atom/rf2aa/SE3Transformer/ && pip3 install --no-cache-dir -r requirements.txt && python3 setup.py install && cd ../../../../
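# (optional) verify the SE3Transformer build succeeded in the RFAA environment
# NOTE: the `se3_transformer` module name is assumed from NVIDIA's reference implementation and may differ
python3 -c "import se3_transformer"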
# - AutoDock Vina Tools environment (~1 GB)
mamba env create -f environments/adfr_environment.yaml --prefix forks/Vina/ADFR/
conda activate forks/Vina/ADFR/ # NOTE: one still needs to use `conda` to (de)activate environments
# - P2Rank (~0.5 GB)
wget -P forks/P2Rank/ https://github.com/rdk/p2rank/releases/download/2.4.2/p2rank_2.4.2.tar.gz
tar -xzf forks/P2Rank/p2rank_2.4.2.tar.gz -C forks/P2Rank/
rm forks/P2Rank/p2rank_2.4.2.tar.gz
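# (optional) confirm that all named and prefix-based environments were created as expected
conda env list
# (optional) confirm the P2Rank release was extracted # NOTE: the exact directory name may vary by release
ls forks/P2Rank/p2rank_2.4.2/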
Download checkpoints (~8.25 GB total)
DynamicBind checkpoint (~0.25 GB)¶
cd forks/DynamicBind/
wget https://zenodo.org/records/10137507/files/workdir.zip
unzip workdir.zip
rm workdir.zip
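After unzipping, a quick listing (run from `forks/DynamicBind/`) confirms that the checkpoint files were extracted into the `workdir/` directory:
ls workdir/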