
API Reference
Locentra OS exposes a full REST API built on FastAPI, enabling easy integration with external tools, scripts, or web UIs.
Base URL: http://localhost:8000
Swagger Docs: http://localhost:8000/docs
API Structure
All endpoints are prefixed with:
/api/
POST /api/llm/query
Send a prompt to the model and receive a generated response.
Request
{
"prompt": "Explain rollups in Ethereum"
}
Response
{
"response": "Rollups are a Layer 2 scaling solution that batch transactions..."
}
Curl Example
curl -X POST http://localhost:8000/api/llm/query \
-H "Content-Type: application/json" \
-d '{"prompt": "What is Solana?"}'
POST /api/llm/train
Train the model on a new prompt → completion pair.
Request
{
"prompt": "What is slippage?",
"completion": "Slippage refers to the difference between expected and actual execution price."
}
Response
{
"status": "ok",
"message": "Training sample processed."
}
Optional Add-ons (via metadata):
- Semantic vectorization of the prompt
- Tagging (for filtered memory access)
- Dry-run flag (simulate training without execution)
Use this endpoint to feed real-world data into your model, one sample at a time.
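For scripted ingestion, a minimal Python sketch using requests is shown below. The metadata field names (tags, dry_run) are assumptions for illustration only; check your backend schema for the names your build actually accepts.

```python
import requests

# Train on a single prompt/completion pair. The "metadata" block is illustrative:
# the exact field names for tagging and dry-run are assumptions, not a documented schema.
payload = {
    "prompt": "What is slippage?",
    "completion": "Slippage refers to the difference between expected and actual execution price.",
    "metadata": {
        "tags": ["defi", "trading"],  # assumed tagging field (filtered memory access)
        "dry_run": True,              # assumed flag: simulate training without execution
    },
}

resp = requests.post("http://localhost:8000/api/llm/train", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())  # expected: {"status": "ok", "message": "Training sample processed."}
```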
POST /api/user/create
Registers a new user.
Request
{
"username": "alice"
}
Response
{
"username": "alice",
"api_key": "d3f3b38e-42aa-4c9c-9447-b5409b09f123"
}
The returned API key is used internally by scripts and agents to track prompt provenance or session state.
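A short Python sketch of registering a user and keeping the key for later calls. Endpoints are open by default, so the X-API-Key header shown here is only an assumed convention for when you enable the auth hooks described below.

```python
import requests

# Register a user and capture the generated API key.
resp = requests.post(
    "http://localhost:8000/api/user/create",
    json={"username": "alice"},
    timeout=10,
)
resp.raise_for_status()
api_key = resp.json()["api_key"]

# If you enable key-auth (see Authentication), passing the key in a header is one
# common pattern. The header name is an assumption, not a shipped default.
headers = {"X-API-Key": api_key}
print(f"Registered alice with key {api_key}")
```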
GET /api/system/logs
Streams live logs from backend activity.
Response
[2025-05-19 13:44] Training started: 12 samples
[2025-05-19 13:45] Vector memory updated
[2025-05-19 13:46] Inference: Prompt received from CLI
Supports optional filtering by level and limit:
/api/system/logs?level=INFO&limit=100
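For example, pulling a filtered slice of the log stream from a script:

```python
import requests

# Fetch up to 100 INFO-level log entries.
resp = requests.get(
    "http://localhost:8000/api/system/logs",
    params={"level": "INFO", "limit": 100},
    timeout=10,
)
resp.raise_for_status()
print(resp.text)  # log lines as returned by the backend
```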
Authentication
Locentra OS ships with open API endpoints by default to support local-first development.
If you need access control:
- Add middleware in backend/api/middleware/auth.py (a minimal sketch follows below)
- Enable token-auth or key-auth in server.py
- Use Docker network isolation or NGINX ACLs for perimeter security
Locentra doesn't lock you in, but it gives you the hooks to lock things down.
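As a starting point, here is a minimal sketch of what a key-auth hook could look like as a FastAPI dependency. The header name, the environment variable, and the router wiring are assumptions to adapt to your own auth.py and server.py, not the shipped implementation.

```python
# Sketch for backend/api/middleware/auth.py -- an API-key check as a FastAPI
# dependency. Header name and key source are assumptions.
import os

from fastapi import Depends, HTTPException, Security
from fastapi.security import APIKeyHeader

api_key_header = APIKeyHeader(name="X-API-Key", auto_error=False)

def require_api_key(api_key: str | None = Security(api_key_header)) -> str:
    expected = os.environ.get("LOCENTRA_API_KEY")  # assumed env var holding a valid key
    if not expected or api_key != expected:
        raise HTTPException(status_code=401, detail="Invalid or missing API key")
    return api_key

# In server.py, routes could then be protected when including a router, e.g.:
# app.include_router(llm_router, dependencies=[Depends(require_api_key)])
```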
Testing & Debugging
You can explore all endpoints via Swagger:
http://localhost:8000/docs
Or test manually with tools like:
- curl (command line)
- httpie
- Postman / Insomnia
- Python scripts using requests (see the example below)
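For instance, the same query the curl example sends, issued from a Python script with requests:

```python
import requests

resp = requests.post(
    "http://localhost:8000/api/llm/query",
    json={"prompt": "What is Solana?"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["response"])
```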
Every call hits the same backend used by the CLI and Web UI; it's all one engine.