I would like to test the newest chattr package to run an LLM in RStudio, but unfortunately I’m not able to get it running. I would like to use the Llama model, which can be downloaded as described here:
git clone --recurse-submodules https://github.com/kuvaus/LlamaGPTJ-chat
cd LlamaGPTJ-chat
And build it:
mkdir build
cd build
cmake ..
cmake --build . --parallel
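Before wiring chattr up, it may be worth checking that the build actually produced the chat executable. This is a minimal sketch, assuming the default CMake layout of the repository (binary under `build/bin/chat`) and the clone location used above:

```shell
# Path where the CMake build is expected to place the chat executable
CHAT_BIN="$HOME/LlamaGPTJ-chat/build/bin/chat"

# Report whether the build actually produced an executable file there
if [ -x "$CHAT_BIN" ]; then
  STATUS="found"
else
  STATUS="missing"
fi
echo "chat binary $STATUS at $CHAT_BIN"
```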
This seems to work, since I now have the folder:
Now we need to set up the model as described here:
remotes::install_github("mlverse/chattr")
library(chattr)
chattr_use("llamagpt")
#>
#> ── chattr
#> • Provider: LlamaGPT
#> • Path/URL: ~/LlamaGPTJ-chat/build/bin/chat
#> • Model: ~/ggml-gpt4all-j-v1.3-groovy.bin
#> • Label: GPT4ALL 1.3 (LlamaGPT)
chattr_defaults()
#>
#> ── chattr ──────────────────────────────────────────────────────────────────────
#>
#> ── Defaults for: Default ──
#>
#> ── Prompt:
#> • Use the R language, the tidyverse, and tidymodels
#>
#> ── Model
#> • Provider: LlamaGPT
#> • Path/URL: ~/LlamaGPTJ-chat/build/bin/chat
#> • Model: ~/ggml-gpt4all-j-v1.3-groovy.bin
#> • Label: GPT4ALL 1.3 (LlamaGPT)
#>
#> ── Model Arguments:
#> • threads: 4
#> • temp: 0.01
#> • n_predict: 1000
#>
#> ── Context:
#> Max Data Files: 0
#> Max Data Frames: 0
#> ✖ Chat History
#> ✖ Document contents
chattr_defaults_save()
chattr_test()
#> Error in c("process_initialize(self, private, command, args, stdin, stdout, ", : ! Native call to `processx_exec` failed
#> Caused by error in `chain_call(c_processx_exec, command, c(command, args), pty, pty_options, …`:
#> ! cannot start processx process '/Users/quinten/LlamaGPTJ-chat/build/bin/chat' (system error 2, No such file or directory) @unix/processx.c:613 (processx_exec)
Created on 2024-04-26 with reprex v2.1.0
Unfortunately, I get the error above. It looks like it can’t find the chat executable at the configured path, but I’m not sure why this happens. Does anyone know how to solve this issue?
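For context, "system error 2, No such file or directory" is ENOENT from the underlying exec call, meaning the launcher could not find a file at the path it was given. A quick sketch for checking both paths that chattr was configured with (taken from the `chattr_use()` output above):

```shell
# Both paths from the chattr configuration above; exec() failing with
# ENOENT means at least the executable path does not resolve to a file.
for f in "$HOME/LlamaGPTJ-chat/build/bin/chat" \
         "$HOME/ggml-gpt4all-j-v1.3-groovy.bin"; do
  if [ -e "$f" ]; then
    echo "exists: $f"
  else
    echo "MISSING: $f"
  fi
done
```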