I am trying to install the rxnfp package in Python, but I keep running into errors during installation. I followed the official instructions, which recommend using Conda to handle the dependencies, but pip install rxnfp still fails. I also tried cloning the repository with git and installing from source.
I have already tried upgrading pip and installing Rust with this command: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
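One thing worth double-checking (a sketch, not specific to rxnfp): rustup installs the toolchain into ~/.cargo/bin, which an already-open shell may not have on PATH, so pip's build step can still fail to find rustc even after a successful install. A quick check:

```shell
# The rustup one-liner, for reference (note plain ASCII double dashes "--" and
# straight quotes; copy-pasting from rendered pages can mangle them):
#   curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# After installing, load the toolchain into the current shell:
#   source "$HOME/.cargo/env"
# Then confirm the compiler is actually visible:
if command -v rustc >/dev/null 2>&1; then
  rust_status="$(rustc --version)"
else
  rust_status="rustc not on PATH"
fi
echo "$rust_status"
```

If this prints "rustc not on PATH" in the same shell where pip install runs, the tokenizers build will fail exactly as shown in the log below.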
Could someone help me understand what might be going wrong and how to fix it?
Environment Details:
Operating System: macOS 14.4.1 (23E224)
Python version: 3.6
Conda version: conda 23.7.4
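For reference, the details above came from commands like these (sw_vers exists only on macOS, hence the fallback; the python and conda lines assume the activated rxnfp env):

```shell
# OS version: sw_vers on macOS, uname elsewhere
os_info="$( (sw_vers 2>/dev/null || uname -sr) )"
echo "$os_info"
# Inside the activated env:
#   python --version
#   conda --version
```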
Thank you in advance for any help or suggestions!
Here are the steps I have followed and the errors encountered:
conda create -n rxnfp python=3.6 -y
conda activate rxnfp
conda install -c rdkit rdkit=2020.03.3 -y
conda install -c tmap tmap -y
pip install rxnfp
Running these commands, I received the following error:
Requirement already satisfied: setuptools in /Users/diegodelozada/anaconda3/envs/rxnfp/lib/python3.6/site-packages (from zc.lockfile->CherryPy>=18.1.0->faerun==0.3.20->rxnfp==0.1.0) (58.0.4)
Collecting jaraco.context>=4.1
Using cached jaraco.context-4.1.1-py3-none-any.whl (4.4 kB)
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
ERROR: Command errored out with exit status 1:
command: /Users/diegodelozada/anaconda3/envs/rxnfp/bin/python /Users/diegodelozada/anaconda3/envs/rxnfp/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /var/folders/bf/7hsdtc_x6ls7z08gzy6qkbq00000gn/T/tmpoxxki8n2
cwd: /private/var/folders/bf/7hsdtc_x6ls7z08gzy6qkbq00000gn/T/pip-install-2cckwo_g/tokenizers_6e884e60d0d24a9b97b39bb28e5a0cd7
Complete output (51 lines):
running bdist_wheel
running build
running build_py
creating build
creating build/lib.macosx-10.9-x86_64-3.6
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers
copying py_src/tokenizers/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers/models
copying py_src/tokenizers/models/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/models
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers/decoders
copying py_src/tokenizers/decoders/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/decoders
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers/normalizers
copying py_src/tokenizers/normalizers/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/normalizers
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers/pre_tokenizers
copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/pre_tokenizers
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers/processors
copying py_src/tokenizers/processors/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/processors
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers/trainers
copying py_src/tokenizers/trainers/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/trainers
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/implementations
copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/implementations
creating build/lib.macosx-10.9-x86_64-3.6/tokenizers/tools
copying py_src/tokenizers/tools/__init__.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/tools
copying py_src/tokenizers/tools/visualizer.py -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/tools
copying py_src/tokenizers/__init__.pyi -> build/lib.macosx-10.9-x86_64-3.6/tokenizers
copying py_src/tokenizers/models/__init__.pyi -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/models
copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/decoders
copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/normalizers
copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/pre_tokenizers
copying py_src/tokenizers/processors/__init__.pyi -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/processors
copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/trainers
copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.macosx-10.9-x86_64-3.6/tokenizers/tools
running build_ext
running build_rust
error: can't find Rust compiler
If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
To update pip, run:
pip install --upgrade pip
and then retry package installation.
If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
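One more detail that may matter (a hypothetical diagnostic, not from the rxnfp docs): the build directory names in the log say x86_64, but an Apple Silicon Mac on macOS 14 reports arm64 hardware; an x86_64 Anaconda running under Rosetta changes which prebuilt tokenizers wheels pip can match, which could be why it falls back to building from source:

```shell
# Hardware architecture as the kernel reports it ("arm64" on Apple Silicon,
# "x86_64" on Intel or inside a Rosetta-translated process):
arch_seen="$(uname -m)"
echo "$arch_seen"
```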