Open Source LLMs with LocalAI

Ajay Chintala
Vignesh Aigal

Build AI apps on LLMStack using open source LLMs like Llama 2, powered by LocalAI

LLMStack now includes LocalAI support, which means you can run open source models like Llama 2 and Stable Diffusion locally and build apps on top of them using LLMStack.

LocalAI

What is LocalAI?

LocalAI is a drop-in replacement REST API that is compatible with the OpenAI API specification and runs inference locally. Read more about LocalAI here.
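To illustrate what "drop-in replacement" means in practice, here is a minimal sketch of an OpenAI-style chat completion request sent to a local LocalAI server. The base URL, port, and model name are assumptions for a default local setup; adjust them to match your own LocalAI deployment.

```python
import requests

# LocalAI exposes the same REST interface as OpenAI, so an OpenAI-style
# request can simply be pointed at the local server instead.
LOCALAI_BASE_URL = "http://localhost:8080/v1"  # assumed default LocalAI address

response = requests.post(
    f"{LOCALAI_BASE_URL}/chat/completions",
    json={
        # Use whichever model your LocalAI instance actually serves.
        "model": "llama-2-7b-chat",
        "messages": [{"role": "user", "content": "Say hello from a local model."}],
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```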

How to use LocalAI with LLMStack?

To use LocalAI with LLMStack, you need LocalAI running on your machine. You can follow the deployment instructions here to install it. Once LocalAI is up and running, configure LLMStack to use it by going to Settings and filling in the LocalAI Base URL and the LocalAI API Key, if any. Once done, click Update to save the configuration.
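Before saving the settings, it can help to confirm that LocalAI is reachable at the URL you plan to enter. The sketch below lists the models the server exposes; the URL, port, and optional API key are assumptions and should match what you put in LLMStack's Settings page.

```python
import requests

LOCALAI_BASE_URL = "http://localhost:8080/v1"  # assumed default LocalAI address
LOCALAI_API_KEY = ""  # leave empty if your LocalAI instance doesn't require a key

headers = {"Authorization": f"Bearer {LOCALAI_API_KEY}"} if LOCALAI_API_KEY else {}
resp = requests.get(f"{LOCALAI_BASE_URL}/models", headers=headers, timeout=30)
resp.raise_for_status()

# LocalAI follows the OpenAI models-list format: {"data": [{"id": ...}, ...]}
for model in resp.json().get("data", []):
    print(model["id"])
```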

LocalAI Configuration

Once LocalAI is configured, you can use it in your apps by selecting LocalAI as the provider for a processor, then choosing the processor and model you want to use.

Are there any other open source LLM frameworks that you would like to see on LLMStack? Let us know in our GitHub Discussions here.