Setting Up Local Language Models for Your App
Running a language model locally can significantly enhance your app's capabilities: it can understand and generate text without relying on external APIs, which improves privacy, reduces latency, and keeps the app working even in offline environments. Here's a guide on how to set up the language model used by your application:
Steps:
To change the LLM in the cloud version:
Step 1
Visit the chat screen, where you will see the currently selected (default) LLM.
Step 2
Click on it to open a dropdown of the available LLMs.
Step 3
Select the LLM you want to use.
To change the LLM in the open-source (self-hosted) version:
Step 1
In the open-source version, edit the .env file and set the LLM_NAME variable to the name of your desired LLM.
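For example, a minimal .env entry could look like the following sketch. The value "openai" is only an illustrative example; see Step 2 for where the actual supported names are listed.

```
# .env — set the LLM provider/model DocsGPT should use.
# "openai" is just an example value; check llm_creator.py for supported names.
LLM_NAME=openai
```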
Step 2
All supported LLM providers live under application/llm, where you can check which environment variables each one needs. The latest list of supported LLMs is maintained in https://github.com/arc53/DocsGPT/blob/main/application/llm/llm_creator.py.
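Conceptually, llm_creator.py acts as a factory that maps the LLM_NAME value to a provider class. The sketch below is only an illustration of that pattern, assuming hypothetical class names (OpenAILLM, AnthropicLLM) and mapping keys; refer to the real file for the actual classes and supported names.

```python
import os

# Hypothetical provider classes, for illustration only; the real ones
# live in the provider files under application/llm.
class OpenAILLM: ...
class AnthropicLLM: ...

# Hypothetical mapping from LLM_NAME values to provider classes,
# mirroring the factory pattern used by llm_creator.py.
LLM_REGISTRY = {
    "openai": OpenAILLM,
    "anthropic": AnthropicLLM,
}

def create_llm():
    """Return an instance of the provider selected via the LLM_NAME env variable."""
    llm_name = os.environ.get("LLM_NAME", "openai")
    try:
        return LLM_REGISTRY[llm_name]()
    except KeyError:
        raise ValueError(f"Unsupported LLM_NAME: {llm_name}")
```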
Step 3
Open application/llm, find the file for the LLM you selected, and fill in the specific requirements it lists (for example, the API key for that LLM).
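As a rough illustration, a completed .env for an API-key-based provider might look like the snippet below. The variable name API_KEY and the placeholder value are assumptions for the example; the exact variables required are defined in the corresponding provider file under application/llm.

```
# .env — example configuration; treat the names below as placeholders and
# check the provider file under application/llm for the exact variables it expects.
LLM_NAME=openai
API_KEY=your-provider-api-key
```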