Commit 10871b19 authored by Manfred Michaelis

fix(removed redundant start code block)

parent 9a8ff18f
Branch: main
@@ -23,17 +23,13 @@ AI RAG Agent is a robust Rust implementation of Retrieval-Augmented Generation (
To use AI RAG Agent, you need to download and convert Hugging Face models to the GGUF format. You can use the provided Python script to automate this process.
### 1. Clone llama.cpp
```bash
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
```
### 2. Install Git LFS
```bash
# Debian/Ubuntu
curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
@@ -48,14 +44,12 @@ git lfs install
### 3. Set Up Python Environment
First, install pyenv and pyenv-virtualenv:
```bash
curl https://pyenv.run | bash
git clone https://github.com/pyenv/pyenv-virtualenv.git $(pyenv root)/plugins/pyenv-virtualenv
```
Add to your shell configuration (~/.bashrc, ~/.zshrc):
```bash
echo 'eval "$(pyenv init -)"' >> ~/.bashrc
echo 'eval "$(pyenv virtualenv-init -)"' >> ~/.bashrc
@@ -64,7 +58,6 @@ source ~/.bashrc
```
Create Python environment:
```bash
pyenv install 3.8.6
pyenv virtualenv 3.8.6 llama
@@ -72,7 +65,6 @@ pyenv activate llama
```
### 4. Install Dependencies
```bash
cd llama.cpp
pip install -r requirements.txt
@@ -81,7 +73,6 @@ pip install -r requirements.txt
### 5. Model Setup
Download and convert Hugging Face models:
```bash
# Download model
git clone https://huggingface.co/mattshumer/Reflection-Llama-3.1-70B ../models/Reflection-Llama-3.1-70B
@@ -90,26 +81,6 @@ git clone https://huggingface.co/mattshumer/Reflection-Llama-3.1-70B ../models/R
python convert_hf_to_gguf.py ../models/Reflection-Llama-3.1-70B
```
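The converter typically writes the GGUF file into the model directory with a generated name. If you want to pin the output path and precision explicitly, recent llama.cpp checkouts of `convert_hf_to_gguf.py` accept `--outfile` and `--outtype` flags; treat the exact flags and the output file name below as assumptions and confirm them with `--help` on your checkout. A minimal sketch:
```bash
# Assumed flags on a recent llama.cpp checkout; verify with:
#   python convert_hf_to_gguf.py --help
python convert_hf_to_gguf.py ../models/Reflection-Llama-3.1-70B \
    --outfile ../models/Reflection-Llama-3.1-70B/reflection-llama-3.1-70b-f16.gguf \
    --outtype f16
```
Whichever `.gguf` path the conversion produces is the value to pass to `with_model_path` in the usage example below.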
## Usage
```rust
use rag_rs::{RagConfig, RagEngine};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Point the engine at a converted GGUF model and set the context window size.
    let config = RagConfig::new()
        .with_model_path("path/to/model.gguf")
        .with_context_size(2048);

    let engine = RagEngine::new(config)?;
    let response = engine.generate("Tell me about Rust programming.")?;
    println!("Generated response: {}", response);

    Ok(())
}
```
## Configuration Options
| Option | Description | Default |