# Getting Started with LLMKit
Welcome to LLMKit! This guide will walk you through setting up and running LLMKit on your local machine using two methods: Docker Compose and the `llmkit` binary. Whether you're a beginner or an experienced developer, these steps will get you up and running quickly. We'll cover starting and stopping the services, along with key details to ensure a smooth setup.
## Prerequisites
- For Docker Compose:
  - Docker and Docker Compose must be installed on your machine.
- For the binary method:
  - No prerequisites are strictly required! The provided `install.sh` script will install the necessary dependencies (Rust and Bun) if they aren't already present.

Note: LLMKit is tested on Linux and macOS. If you're on Windows, consider using WSL (Windows Subsystem for Linux) or adapting the scripts as needed.
## Method 1: Using Docker Compose
This method uses Docker to run LLMKit's services in containers, making setup straightforward and portable.
### Steps
- Clone the Repository
  - Open a terminal and run:

    ```bash
    git clone https://github.com/llmkit-ai/llmkit.git
    cd llmkit
    ```
- Set Up the Environment File
  - Copy the example environment file:

    ```bash
    cp .env.example .env
    ```
  - Open `.env` in a text editor and add your API keys (e.g., for OpenRouter or other providers) and a secure `JWT_SECRET`. For example:

    ```text
    OPENROUTER_API_KEY=your_openrouter_key_here
    JWT_SECRET=your_secure_random_string
    # Add other API keys as needed (e.g., ANTHROPIC_API_KEY, GOOGLE_API_KEY)
    ```
  - You can obtain API keys from the respective providers (e.g., OpenRouter). The `JWT_SECRET` should be a random, secure string used for authentication; one way to generate it is sketched after these steps.
- Start the Services
  - Launch the containers in detached mode:

    ```bash
    docker-compose up -d
    ```
  - This starts both the backend (Rust API) and frontend (Nuxt.js UI) services. A quick way to verify they came up is shown after these steps.
- Access the Application
  - UI: Open your browser and go to http://localhost:3000
  - API: Access the API at http://localhost:8000
- Stop the Services
  - To shut down the containers and clean up:

    ```bash
    docker-compose down
    ```
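
The guide leaves the choice of `JWT_SECRET` up to you. One common way to generate a sufficiently random value, assuming `openssl` is available (it usually is on Linux and macOS), is:

```bash
# Generate a 64-character hex string suitable for use as JWT_SECRET
openssl rand -hex 32
```

Copy the output into the `JWT_SECRET=` line of your `.env` file. Any other source of a long, random string works just as well, and the same approach applies whether you use Docker Compose or the binary method.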
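
If you want to confirm the containers actually came up before opening the UI, the standard Docker Compose commands work as you'd expect. (The service names come from the repository's `docker-compose.yml`, so check that file if you want to target a single service.)

```bash
# Show the status of the LLMKit containers
docker-compose ps

# Follow the logs of all services; Ctrl+C stops the log stream, not the containers
docker-compose logs -f
```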
## Method 2: Using the `llmkit` Binary
This method builds and runs LLMKit directly on your machine using the `llmkit` binary, giving you more control and flexibility.
### Steps
- Clone the Repository
  - In a terminal, run:

    ```bash
    git clone https://github.com/llmkit-ai/llmkit.git
    cd llmkit
    ```
- Run the Installation Script
  - Execute the setup script:

    ```bash
    ./install.sh
    ```
  - This script:
    - Installs Rust and Bun if they're not already present.
    - Builds the backend binary.
    - Sets up the frontend dependencies.
    - Creates a default `.env` file in the `backend` directory.
- Set Up the Environment File
  - Navigate to the `backend` directory and edit the `.env` file:

    ```bash
    cd backend
    nano .env  # or use your preferred editor
    ```
  - Add your API keys and set a secure `JWT_SECRET`. For example:

    ```text
    OPENROUTER_API_KEY=your_openrouter_key_here
    JWT_SECRET=your_secure_random_string
    # Add other API keys as needed
    ```
  - Save and exit the editor.
- Start the Application
  - From the root directory, run:

    ```bash
    llmkit start
    ```
  - This command:
    - Starts the backend server (Rust API).
    - Starts the frontend server (Nuxt.js UI).
    - Creates and migrates the SQLite database (`llmkit.db`) if it doesn't exist.
- Access the Application
  - UI: Visit http://localhost:3000 in your browser.
  - API: Use the API at http://localhost:8000. A quick sanity check is shown after these steps.
- Stop the Services
  - Since `llmkit start` runs the servers in the foreground, press `Ctrl+C` in the terminal to stop both the backend and frontend. If you need them to keep running in the background, see the sketch after these steps.
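
Once `llmkit start` reports that it's running, a quick sanity check is to confirm both ports answer. The exact responses depend on the routes LLMKit exposes, so treat anything other than a connection error as a good sign:

```bash
# Check that the frontend and backend are listening (responses vary by route)
curl -I http://localhost:3000
curl -I http://localhost:8000
```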
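
Because the servers run in the foreground, closing the terminal stops them. If you'd rather keep LLMKit running after you close the session, one generic shell approach (not an LLMKit feature; adjust the command if the `llmkit` binary isn't on your PATH) is:

```bash
# Run llmkit in the background and keep its output in a log file
nohup llmkit start > llmkit.log 2>&1 &
echo $! > llmkit.pid

# Later, stop the background process
kill "$(cat llmkit.pid)"
# If the backend or frontend keeps running afterwards, stop those processes too
```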
## Additional Tips
- Environment Variables: Both methods require API keys for full functionality. Check `.env.example` for a list of supported providers. At minimum, set an `OPENROUTER_API_KEY` to get started.
- Database:
  - In Docker Compose, the database is persisted via a Docker volume.
  - In the binary method, the SQLite database (`llmkit.db`) is stored in the `backend` directory and managed automatically by `llmkit start`. A quick way to peek inside it is shown after these tips.
- Development Mode (Optional): Want to tweak the code? Run the backend and frontend separately:
  - Backend: In the `backend` directory, run:

    ```bash
    cargo run
    ```
  - Frontend: In the `ui` directory, run:

    ```bash
    npm run dev  # or bun run dev
    ```
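
If you're curious what the migrations created, and you have the `sqlite3` command-line tool installed, you can inspect the binary method's database directly (read-only here, to avoid interfering with a running server):

```bash
# List the tables created by the migrations
sqlite3 -readonly backend/llmkit.db ".tables"

# Dump the full schema
sqlite3 -readonly backend/llmkit.db ".schema"
```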
## Next Steps
With LLMKit running, you’re ready to:
- Explore the UI at http://localhost:3000 to manage prompts and settings.
- Use the API at http://localhost:8000 for programmatic access (it's OpenAI-compatible! A sample request is sketched below).
- Dive deeper with additional documentation (e.g., API Reference or Code Examples).
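
Since the API is OpenAI-compatible, a request shaped like an OpenAI chat completion call is a natural first test. The endpoint path, authentication header, and `model` value below are assumptions based on the OpenAI wire format rather than confirmed LLMKit specifics, so check the API Reference for the exact details:

```bash
# Hypothetical OpenAI-style request against the local LLMKit API.
# The path, auth header, and model value are assumptions; see the API Reference.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLMKIT_API_KEY" \
  -d '{
        "model": "your-prompt-or-model-name",
        "messages": [{"role": "user", "content": "Hello from LLMKit!"}]
      }'
```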
Enjoy building with LLMKit! If you run into issues, check the repository’s README or open an issue on GitHub. Happy prompting!