Secure and scalable API for local LLM inference
The first registered user automatically becomes an administrator with full access.
To use the API, include your API key in each request:
curl -X POST "http://your-server/v1/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "your-model",
    "prompt": "Your prompt here",
    "max_tokens": 100,
    "api_key": "your-api-key"
  }'
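The same request can be issued from Python with only the standard library. This is a minimal sketch assuming the endpoint and field names shown in the curl example above; the server URL, model name, and API key are placeholders you must replace.

```python
import json
import urllib.request

# Placeholder endpoint; replace with your server's address.
API_URL = "http://your-server/v1/completions"

def build_request(prompt, api_key, model="your-model", max_tokens=100):
    """Build a POST request mirroring the curl example: the API key is
    sent in the JSON body, alongside model, prompt, and max_tokens."""
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "api_key": api_key,
    }).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request requires a running server:
# with urllib.request.urlopen(build_request("Your prompt here", "your-api-key")) as resp:
#     print(json.loads(resp.read()))
```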