Installation Guide

This guide will help you set up Ollama PDF RAG on your system.

Prerequisites

Before installing Ollama PDF RAG, ensure you have:

  1. Python 3.9 or higher installed
  2. pip (Python package installer)
  3. git installed
  4. Ollama installed on your system

Installing Ollama

  1. Visit Ollama's website to download and install the application
  2. After installation, pull the required models:
    ollama pull llama3.2  # or your preferred model
    ollama pull nomic-embed-text
    
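You can confirm the models were pulled successfully by listing what Ollama has available locally; both llama3.2 (or your chosen model) and nomic-embed-text should appear in the output:

    ollama list
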

Installing Ollama PDF RAG

  1. Clone the repository:

    git clone https://github.com/tonykipkemboi/ollama_pdf_rag.git
    cd ollama_pdf_rag
    

  2. Create and activate a virtual environment:

    # On macOS/Linux
    python -m venv venv
    source venv/bin/activate
    
    # On Windows
    python -m venv venv
    .\venv\Scripts\activate
    

  3. Install dependencies:

    pip install -r requirements.txt
    
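As a quick sanity check, you can confirm the core packages import cleanly. This assumes Streamlit is among the dependencies, since the app serves on Streamlit's default port 8501:

    python -c "import streamlit; print(streamlit.__version__)"
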

Verifying Installation

  1. Start Ollama in the background, for example by launching the Ollama app or running ollama serve (a quick connectivity check is shown after this list)
  2. Run the application:
    python run.py
    
  3. Open your browser to http://localhost:8501
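If the app cannot reach Ollama, you can check that the Ollama server is responding on its default port (11434); this should return a JSON list of your installed models:

    curl http://localhost:11434/api/tags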

Troubleshooting

Common Issues

ONNX DLL Error

If you see this error:

DLL load failed while importing onnx_copy2py_export: a dynamic link library (DLL) initialization routine failed.

Try these solutions:

  1. Install the Microsoft Visual C++ Redistributable:
    - Download both the x64 and x86 versions from Microsoft's website
    - Restart your computer

  2. Or reinstall ONNX Runtime:

    pip uninstall onnxruntime onnxruntime-gpu
    pip install onnxruntime
    
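After reinstalling, you can verify that ONNX Runtime now imports without the DLL error:

    python -c "import onnxruntime; print(onnxruntime.__version__)"
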

CPU-Only Systems

For systems without a GPU:

  1. Install the CPU version of ONNX Runtime (a quick provider check is shown after this list):

    pip uninstall onnxruntime-gpu
    pip install onnxruntime
    

  2. Adjust chunk size if needed:
    - Reduce it to 500-1000 if you hit memory issues
    - Increase the overlap for better context
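
To confirm you are running the CPU-only build, you can print the execution providers ONNX Runtime has available; it should list CPUExecutionProvider and no CUDA provider:

    python -c "import onnxruntime; print(onnxruntime.get_available_providers())"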

Next Steps