v1.10.4

BYOLLM (Bring Your Own LLM) Improvements

  • Simplified BYOLLM configuration for organizations and individuals
  • Increased availability of OpenAI models
  • Added user-facing debugging & logging for BYOLLM errors
  • Rewrote BYOLLM networking stack to support more enterprise environments with custom proxies & certificates (see the sketch below)
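
For teams running BYOLLM behind a corporate network, the sketch below shows the general pattern this release supports: an OpenAI-compatible chat request configured with a custom base URL and API key, routed through an enterprise proxy, and validated against a private CA bundle. This is illustrative only; the proxy URL, CA path, endpoint, and model name are placeholder assumptions, not Quill's actual configuration.

```python
# Illustrative only: an OpenAI-compatible request through a corporate proxy
# with a private CA bundle. All URLs, paths, and the model name are
# hypothetical placeholders, not Quill's real configuration.
import os
import requests

PROXIES = {
    "http": "http://proxy.corp.example:8080",   # hypothetical enterprise proxy
    "https": "http://proxy.corp.example:8080",
}
CA_BUNDLE = "/etc/ssl/certs/corp-root-ca.pem"   # hypothetical private root CA

response = requests.post(
    "https://llm.corp.example/v1/chat/completions",  # OpenAI-compatible endpoint
    headers={"Authorization": f"Bearer {os.environ['BYOLLM_API_KEY']}"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Summarize this meeting."}],
    },
    proxies=PROXIES,   # route traffic through the corporate proxy
    verify=CA_BUNDLE,  # validate TLS against the custom certificate chain
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```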

Features

  • Improved MCP (Model Context Protocol) server installation and management, sketched below (still in beta; email support@quillmeetings.com for access)
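
For context, an MCP server is a small process that exposes tools to a client over a standard protocol. The sketch below uses the official MCP Python SDK (the "mcp" package) to define one tool and serve it over stdio; the server and tool names are hypothetical, and it does not show Quill's installation or management flow.

```python
# Minimal MCP server sketch using the official Python SDK ("mcp" package).
# Illustrative only: the server and tool names are hypothetical.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("meeting-notes-tools")  # hypothetical server name

@mcp.tool()
def word_count(text: str) -> int:
    """Count the words in a block of meeting notes."""
    return len(text.split())

if __name__ == "__main__":
    # Serves over stdio by default, which is how most MCP clients
    # launch and manage locally installed servers.
    mcp.run()
```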

Bug Fixes & Stability

  • Rewrote transcription pipeline for improved reliability and logging
  • Resolved payment method update errors in the billing system
  • Improved reliability of weekly summary email generation on the local computer