Commit 82bf0b0d authored by William E Warriner

fix readme

parent 7958bd51
@@ -34,14 +34,13 @@ It is recommended to use an HPC Desktop job in the Interactive Apps section of <
 1. `conda env create --file environment.yml`
 1. Obtain the rendered UAB RC Documentation pages by running `pull-site.sh`.
 1. Set up `ollama` by running `setup-ollama.sh`.
-1. Start the `ollama` server by running `./ollama serve`.
 ### Once-per-job Setup
 1. Load the Miniforge module with `module load Miniforge3`.
+1. Start the `ollama` server application with `./ollama serve`.
-### To Run
+### To Run the Example
 1. Run the Jupyter notebook `main.ipynb`.
    - At the time of writing, the Documentation pages contain enough data that generating the embeddings takes about 7-10 minutes, so please be patient (a rough sketch of this step follows below).
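
The embedding step itself is not shown in this diff. As a rough illustration of what it involves, a single chunk of the pulled documentation could be embedded against the running `ollama` server as below. This is a minimal sketch that assumes the server is listening on its default port (11434) and that the notebook uses the HTTP API; the actual code in `main.ipynb` may differ.

```bash
# Sketch only: embed one documentation chunk with the bge-m3 model via the local
# ollama server. Assumes `./ollama serve` is already running on the default port.
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "bge-m3", "prompt": "Text of one rendered UAB RC Documentation chunk."}'
# The JSON response contains an "embedding" array of floats. Repeating this for every
# chunk of the pulled site is why the step takes several minutes.
```
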
@@ -55,7 +54,7 @@ The models are supplied by `ollama`.
 - LLM: <https://ollama.com/library/llama3.1>
 - Embedding: <https://ollama.com/library/bge-m3>
-## Using other versions and models
+## Using Other Versions and Models
 Newer versions of `ollama` are compressed as `.tar.gz` files on the GitHub releases page (<https://github.com/ollama/ollama/releases>). When modifying the `setup-ollama.sh` script to use these versions, you will need to take this into account.
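
As a hedged sketch of what such a modification might look like, a setup script could download and unpack a release tarball along these lines. The release tag and asset name below are placeholders, not taken from this repository or from `setup-ollama.sh`; confirm both on the releases page first.

```bash
# Illustrative sketch only: fetch a newer ollama release tarball and unpack it so a
# newer ./ollama binary can be used. The tag and asset name are assumptions; check
# https://github.com/ollama/ollama/releases for the real ones.
VERSION="vX.Y.Z"                # placeholder release tag
ASSET="ollama-linux-amd64.tgz"  # assumed Linux x86_64 asset name
curl -fL -o ollama.tgz "https://github.com/ollama/ollama/releases/download/${VERSION}/${ASSET}"
tar -xzf ollama.tgz
# Depending on the release layout, the binary may extract as ./ollama or under ./bin/;
# adjust the `./ollama serve` invocation in the setup steps accordingly.
```
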
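Stepping back to the two models listed in the diff above, a rough sketch of how the generation model could be queried once the server is running is shown below. How `main.ipynb` actually builds its prompts is an assumption here; the sketch simply pastes a retrieved documentation excerpt (placeholder text) into a non-streaming chat request.

```bash
# Sketch only: non-streaming chat request to llama3.1 via the local ollama server,
# with retrieved documentation text included in the prompt (placeholder below).
curl -s http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "stream": false,
  "messages": [
    {
      "role": "user",
      "content": "Answer using only this documentation excerpt: <retrieved text here>\n\nQuestion: How do I request an HPC Desktop job?"
    }
  ]
}'
```
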