A local Q&A engine using llama and FAISS

FAISS implements fast similarity search over vectors. Sentence Transformers encodes text into the vectors FAISS can index. A local Llama model can then answer questions grounded in the passages FAISS retrieves from that vector store. By combining the three we can build a local, GenAI-powered assistant that answers semantic queries against a local corpus of documents. Let's see how.
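At a high level the pieces slot together like this: embed the document chunks, index them with FAISS, retrieve the chunks closest to a question, and hand them to Llama as context. Below is a minimal sketch of that flow, assuming the sentence-transformers, faiss-cpu, and llama-cpp-python packages; the embedding model, placeholder chunks, model path, and prompt are illustrative assumptions rather than the exact setup used in this post.

import faiss
from sentence_transformers import SentenceTransformer
from llama_cpp import Llama

# 1. Encode document chunks into dense vectors.
encoder = SentenceTransformer("all-MiniLM-L6-v2")   # embedding model is an assumption
chunks = ["First passage of the corpus.", "Second passage of the corpus."]
embeddings = encoder.encode(chunks)                 # float32 array, one row per chunk

# 2. Index the vectors in FAISS for similarity search.
index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)

# 3. Embed the question and retrieve the closest chunks.
question = "What does the corpus say about X?"
_, ids = index.search(encoder.encode([question]), 2)   # 2 nearest chunks
context = "\n".join(chunks[i] for i in ids[0])

# 4. Ask a local Llama model to answer from the retrieved context only.
llm = Llama(model_path="path/to/llama-2-chat-model.bin")   # hypothetical path
prompt = (
    "Answer the question using only the context below.\n\n"
    f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
)
print(llm(prompt, max_tokens=256)["choices"][0]["text"])

The same four steps scale to a real corpus by swapping the placeholder chunks for text split from your own documents.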


For this example I’m using llama-2-7b-chat.ggmlv3.q4_0.bin and this document (notes.txt), from which I’d like my agent to provide me a...
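As a sketch of how those two pieces might be loaded (the exact steps are cut off above, so the blank-line chunking and the n_ctx value are assumptions; note also that recent llama-cpp-python releases expect GGUF files, so a ggmlv3 model needs an older release or a conversion step):

from llama_cpp import Llama

# Split notes.txt on blank lines into paragraph-sized chunks for the FAISS index.
with open("notes.txt", encoding="utf-8") as f:
    chunks = [p.strip() for p in f.read().split("\n\n") if p.strip()]

# Load the local chat model; n_ctx sizes the context window for prompt plus answer.
llm = Llama(model_path="llama-2-7b-chat.ggmlv3.q4_0.bin", n_ctx=2048)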
