RAG chat with a PDF using a local LLM
Version 1.0.0
Install the dependencies (pydantic is pinned to 1.10.9 for compatibility):

pip install -r requirements.txt
pip install -U pydantic==1.10.9

Run the app:

streamlit run chat.py
The easiest way to run a local LLM is LM Studio: https://lmstudio.ai/
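Once LM Studio's local server is running, the app can talk to it over its OpenAI-compatible HTTP API. A minimal sketch of the RAG request shape, assuming the default base URL http://localhost:1234/v1 and a placeholder model name (check the "Local Server" tab in LM Studio for your actual values):

```python
# Sketch: query an LM Studio local server via its OpenAI-compatible
# /v1/chat/completions endpoint. BASE_URL and the model name are
# assumptions; adjust them to what LM Studio reports.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default port


def build_chat_payload(question, context_chunks, model="local-model"):
    """Assemble a chat request that stuffs retrieved PDF chunks
    into the system prompt (the core RAG step)."""
    context = "\n\n".join(context_chunks)
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": f"Answer using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
        "temperature": 0.2,
    }


def ask(question, context_chunks):
    """POST the payload to the local server and return the reply text."""
    payload = build_chat_payload(question, context_chunks)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The actual retrieval (chunking the PDF and picking relevant chunks) happens in chat.py; this only shows the final call to the model.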
The LLM I use in my conference talks (it runs fine on an MBP M1 Max with 64 GB of RAM):