Overview:
lluminous is a fast, lightweight, open chat UI that provides a seamless chat experience with providers such as OpenAI, Anthropic, Groq, and others. It also supports local models through llama.cpp and includes a range of tools for easy access and integration.
Features:
- Multiple Providers: Easily plug in API keys from various providers.
- Local Models: Utilize llama.cpp for local model support.
- OpenAI and Other Providers: Access models from OpenAI, Anthropic, Groq, and 50+ others.
- Multimodal Input: Upload, paste, or share links to images.
- Image Generation: Utilize DALL-E 3 for image generation.
- Multi-Shot Prompting: Edit, delete, or regenerate messages as needed.
- Pre-Filled Responses: Pre-fill the start of the model's response on providers that support it.
- Privacy: All conversation history and API keys are stored locally, only in the user's browser.
Installation:
To install lluminous, follow these steps:
- Clone the repository.
- Install and start the client: run `npm i && npm run dev`, then access the client at http://localhost:5173.
- Install and start the server: navigate to the server directory, build with `go build`, and start with `PASSWORD="chooseapassword" ./server -sandbox <sandbox_path>`. The server listens at http://localhost:8081. Enter the server address and the chosen password in the chat UI.
Note: The sandbox feature currently only functions on macOS due to macOS-specific sandboxing features.
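The shell sketch below combines the steps above into one session. It assumes the repository URL (shown here as the placeholder <repo_url>) and that the Go server code lives in a server/ subdirectory, as described in the steps; adjust paths to your setup.

# Clone the repository and install/start the client
git clone <repo_url> lluminous
cd lluminous
npm i
npm run dev &    # client served at http://localhost:5173

# Build and start the server (sandbox feature is macOS-only)
cd server
go build
PASSWORD="chooseapassword" ./server -sandbox <sandbox_path>    # listens at http://localhost:8081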
Summary:
lluminous is a versatile chat UI that gives users efficient access to providers like OpenAI, Anthropic, and Groq. With local model support, a broad set of tools, and strong privacy guarantees, it offers a seamless and secure chat experience. Installation is straightforward: set up the client and server components to get full functionality.