LocalServerAIClient

SWIFT

LM Studio

LocalServerAIClient is a native iOS app that connects to a locally hosted, OpenAI-compatible AI server, allowing users to chat with selected models through a simple interface while keeping conversation history stored on the device.

Built with SwiftUI and SwiftData, the app focuses on making self-hosted AI more accessible on mobile by combining a clean chat experience with flexible local server configuration. Users can set the server host and port, switch between HTTP and HTTPS, enable or disable response streaming, and refresh the list of available models directly from the app.
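The configuration options described above can be sketched as a small Swift model. This is an illustrative assumption, not the app's actual source: the type and property names (`ServerConfig`, `useHTTPS`, `streamResponses`, `modelsRequest`) are hypothetical, and the default port follows LM Studio's convention.

```swift
import Foundation

// Hypothetical sketch of the settings the app exposes: host, port,
// an HTTP/HTTPS toggle, and a streaming switch.
struct ServerConfig {
    var host: String = "192.168.1.10"
    var port: Int = 1234          // LM Studio's default local server port
    var useHTTPS: Bool = false
    var streamResponses: Bool = true

    // Base URL for OpenAI-compatible endpoints, e.g. http://host:port/v1
    var baseURL: URL {
        var components = URLComponents()
        components.scheme = useHTTPS ? "https" : "http"
        components.host = host
        components.port = port
        components.path = "/v1"
        return components.url!
    }
}

// Refreshing the model list maps to GET /v1/models on any
// OpenAI-compatible server.
func modelsRequest(for config: ServerConfig) -> URLRequest {
    URLRequest(url: config.baseURL.appendingPathComponent("models"))
}
```

Keeping the scheme, host, and port in one value type makes it straightforward to persist the settings and rebuild endpoint URLs whenever the user edits them.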

The client supports both standard and streaming chat completions: streaming renders tokens as they arrive for a more immediate feel, while standard mode waits for the complete response. It also includes persistent multi-conversation management, so chats can be reopened later, reviewed, or deleted when no longer needed. As a portfolio project, it demonstrates practical iOS development skills in UI design, state management, local data persistence, asynchronous networking, and integration with OpenAI-style APIs for self-hosted AI workflows.
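In the streaming mode, OpenAI-compatible servers deliver the reply as server-sent events: each line reads `data: {json chunk}`, ending with `data: [DONE]`. A minimal sketch of parsing one such line (assumed for illustration; the `StreamChunk` and `deltaContent` names are not from the app):

```swift
import Foundation

// Decodes the incremental text carried by one streaming chunk of an
// OpenAI-style chat completion response.
struct StreamChunk: Decodable {
    struct Choice: Decodable {
        struct Delta: Decodable { let content: String? }
        let delta: Delta
    }
    let choices: [Choice]
}

// Extracts the delta text from a single SSE line, or nil for
// non-data lines and the terminating "[DONE]" marker.
func deltaContent(fromSSELine line: String) -> String? {
    guard line.hasPrefix("data: ") else { return nil }
    let payload = String(line.dropFirst("data: ".count))
    guard payload != "[DONE]",
          let data = payload.data(using: .utf8),
          let chunk = try? JSONDecoder().decode(StreamChunk.self, from: data)
    else { return nil }
    return chunk.choices.first?.delta.content
}
```

Appending each decoded delta to the visible message is what produces the token-by-token effect, while the non-streaming path simply decodes one complete JSON body.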