Apollo AI: Private & Local AI is a client for accessing language models from various sources: it lets you chat offline by connecting to your locally hosted LLM, or online through OpenAI's API.
Allows you to connect to your privately hosted Large Language Model (LLM) backend and chat without needing an internet connection (see the sketch below).
Provides the option to chat through OpenAI's API when you prefer hosted models.
Connects to locally hosted private LLMs, so you can work with AI without relying on cloud services.
Acts as a customizable client for accessing language models, giving users control over AI interactions and preferences.
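To make the local-backend idea concrete, here is a minimal sketch of the kind of request such a client might send. It assumes a locally hosted server that exposes an OpenAI-compatible /v1/chat/completions endpoint at http://localhost:8080; the URL, model name, and API key are illustrative placeholders, not details taken from the app itself.

```python
# Minimal sketch: chatting with a locally hosted, OpenAI-compatible backend.
# BASE_URL, API_KEY, and the model name are assumptions for illustration only.
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"   # assumed address of a local LLM server
API_KEY = "not-needed-for-local"        # OpenAI's hosted API would require a real key


def chat(messages, model="local-model"):
    """Send a chat request and return the assistant's reply text."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    request = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    reply = chat([{"role": "user", "content": "Hello, are you running locally?"}])
    print(reply)
```

Because many local LLM servers expose this same OpenAI-compatible interface, the same request shape works against OpenAI's hosted API by swapping the base URL and supplying a real API key.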