✨ PeerLLM Orchestrator API

Endpoints for managing conversations across distributed LLM hosts.

POST /api/chats/start

Start a new conversation with a host supporting the specified model.

Request Body

{
  "modelName": "mistral-7b"
}
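A minimal sketch of constructing this request with Python's standard library. The base URL is a placeholder, not part of this spec, and the response shape is not assumed:

```python
import json
import urllib.request

# Hypothetical base URL; substitute your orchestrator's address.
BASE_URL = "http://localhost:8080"

def build_start_request(model_name: str) -> urllib.request.Request:
    """Build (but do not send) the POST /api/chats/start request."""
    body = json.dumps({"modelName": model_name}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/api/chats/start",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it would then be: urllib.request.urlopen(build_start_request("mistral-7b"))
```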

Responses

POST /api/chats/stream

Stream the LLM's response to a prompt within an existing conversation. Results are returned as a sequence of text chunks.

Request Body

{
  "conversationId": "f2a1c4e9-9d8e-4a88-bb3b-73c71f65a1df",
  "text": "Write a haiku about AI."
}
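A sketch of consuming the chunked stream. The base URL and the plain-text chunk framing are assumptions; the spec above says only that results arrive as a sequence of text chunks:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # hypothetical; not defined by this spec

def build_stream_body(conversation_id: str, text: str) -> bytes:
    """JSON body for POST /api/chats/stream."""
    return json.dumps({"conversationId": conversation_id, "text": text}).encode("utf-8")

def stream_chat(conversation_id: str, text: str):
    """Yield decoded text chunks from the response as they arrive."""
    req = urllib.request.Request(
        f"{BASE_URL}/api/chats/stream",
        data=build_stream_body(conversation_id, text),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # Read incrementally instead of buffering the whole response.
        # Assumes chunk boundaries fall on UTF-8 character boundaries.
        while chunk := resp.read(1024):
            yield chunk.decode("utf-8")
```

The full completion is then simply `"".join(stream_chat(conversation_id, prompt))`.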

Responses

POST /api/chats/end

End an active conversation and release resources.

Request Body

The conversation ID, sent as a bare JSON string rather than an object:

"f2a1c4e9-9d8e-4a88-bb3b-73c71f65a1df"
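Because the body here is a bare JSON string, serializing it correctly is easy to get wrong. A small sketch of building the payload (the helper name is illustrative, not part of the API):

```python
import json

def build_end_body(conversation_id: str) -> bytes:
    # The body is the conversation ID as a bare JSON string ("..."),
    # not an object like {"conversationId": "..."}.
    return json.dumps(conversation_id).encode("utf-8")

print(build_end_body("f2a1c4e9-9d8e-4a88-bb3b-73c71f65a1df"))
```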

Responses