AI Companion

Self-hosted AI chatbot engine with offline execution, CUDA/Metal acceleration, and real-time learning capabilities.

About This Project

AI Companion is a self-hosted AI chatbot engine designed for fully offline execution, with support for both CPU and GPU inference. The platform runs popular models such as Llama 2 and Mistral in the .gguf format, using CUDA and Metal acceleration for faster generation on supported hardware.
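Backend selection for this kind of engine is often decided at compile time via Cargo feature flags. The sketch below is an illustrative assumption, not the project's actual code; the feature names `cuda` and `metal` and the `select_backend` function are hypothetical:

```rust
// Hypothetical compile-time backend selection; the "cuda" and "metal"
// feature names are assumptions for illustration, not the project's own.
#[derive(Debug, PartialEq)]
enum Backend {
    Cuda,
    Metal,
    Cpu,
}

fn select_backend() -> Backend {
    if cfg!(feature = "cuda") {
        // Built with the CUDA feature: prefer NVIDIA GPU acceleration.
        Backend::Cuda
    } else if cfg!(feature = "metal") {
        // Built with the Metal feature: prefer Apple GPU acceleration.
        Backend::Metal
    } else {
        // Default build: fall back to CPU inference.
        Backend::Cpu
    }
}
```

With `cfg!`, the unused branches are still type-checked but the choice costs nothing at runtime, which suits a single compact binary targeting several hardware profiles.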

Key features include roleplay syntax support (*actions*), a dual memory system (short- and long-term), and real-time learning capabilities. The entire application ships as a compact 26MB Rust binary with an integrated web UI, making it easy to deploy and use.
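The roleplay convention marks physical actions with paired asterisks inside a message (e.g. `Hello *waves* there`). A minimal parser for that convention might look like the following sketch; the `Segment` type and the rule for unmatched asterisks are illustrative assumptions:

```rust
// Minimal sketch of roleplay-syntax parsing, where *asterisks* delimit
// actions and everything else is spoken dialogue. The segment names and
// the handling of unmatched '*' are assumptions for illustration.
#[derive(Debug, PartialEq)]
enum Segment {
    Speech(String),
    Action(String),
}

fn parse_roleplay(input: &str) -> Vec<Segment> {
    let mut segments = Vec::new();
    let mut rest = input;
    while let Some(start) = rest.find('*') {
        // Text before the opening '*' is dialogue.
        let before = rest[..start].trim();
        if !before.is_empty() {
            segments.push(Segment::Speech(before.to_string()));
        }
        match rest[start + 1..].find('*') {
            Some(len) => {
                // Text between the asterisk pair is an action.
                let action = rest[start + 1..start + 1 + len].trim();
                if !action.is_empty() {
                    segments.push(Segment::Action(action.to_string()));
                }
                rest = &rest[start + 1 + len + 1..];
            }
            None => {
                // Unmatched '*': treat the remainder as plain speech.
                let tail = rest[start..].trim();
                if !tail.is_empty() {
                    segments.push(Segment::Speech(tail.to_string()));
                }
                rest = "";
            }
        }
    }
    let tail = rest.trim();
    if !tail.is_empty() {
        segments.push(Segment::Speech(tail.to_string()));
    }
    segments
}
```

Splitting messages this way lets the UI render actions in italics and lets the engine keep narration and dialogue separate in the conversation history.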

Key Features

  • Offline execution (CPU/GPU)
  • CUDA and Metal acceleration
  • Llama 2/Mistral model support
  • Roleplay syntax support
  • Dual memory system
  • Real-time learning
  • Compact 26MB binary
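A dual memory system typically pairs a bounded window of recent turns with a larger persistent store. The sketch below shows one plausible shape for this; the struct names, field layout, and eviction policy are assumptions for illustration, not the project's actual design:

```rust
use std::collections::VecDeque;

// Hypothetical sketch of a dual memory system: a bounded short-term
// buffer of recent turns plus an append-only long-term store. Names
// and the eviction policy are illustrative assumptions.
struct Memory {
    short_term: VecDeque<String>, // recent conversation turns
    long_term: Vec<String>,       // persisted older turns/summaries
    capacity: usize,              // short-term window size
}

impl Memory {
    fn new(capacity: usize) -> Self {
        Memory {
            short_term: VecDeque::new(),
            long_term: Vec::new(),
            capacity,
        }
    }

    // Record a turn; when the short-term window overflows, the oldest
    // turn is evicted into long-term memory instead of being dropped.
    fn remember(&mut self, turn: &str) {
        if self.short_term.len() == self.capacity {
            if let Some(oldest) = self.short_term.pop_front() {
                self.long_term.push(oldest);
            }
        }
        self.short_term.push_back(turn.to_string());
    }
}
```

Keeping the short-term window small bounds the prompt context fed to the model on each turn, while the long-term store remains available for retrieval or summarization.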

Challenges

  • Implementing efficient offline AI inference
  • Optimizing across multiple hardware acceleration backends (CUDA, Metal, CPU)
  • Building an intuitive web UI for complex AI features

Results

  • Fully self-hosted AI solution
  • High performance across different hardware
  • User-friendly interface for complex AI operations

Technologies Used