Screenshots
Visual documentation of Convoscope's interface and functionality.
Application Interface
Main Chat Interface

Features shown:

- Multi-provider LLM selection - Dropdown for OpenAI, Anthropic Claude, and Google Gemini
- Clean, responsive chat interface - Professional Streamlit-based design
- Real-time conversation display - Smooth message flow with user and assistant messages
- Configuration sidebar - Temperature, token limits, and system prompt controls
- Conversation management - Save, load, and export functionality
- Provider status indicators - Visual feedback on active provider and model
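The conversation management feature above (save, load, and export) can be sketched as a simple JSON round-trip. This is a minimal illustration, not Convoscope's actual code; the function names and file path are assumptions.

```python
# Hypothetical sketch of conversation save/load; names are illustrative,
# not Convoscope's actual API.
import json
from pathlib import Path

def save_conversation(messages: list[dict], path: str) -> None:
    # Persist the chat history as pretty-printed JSON.
    Path(path).write_text(json.dumps(messages, indent=2))

def load_conversation(path: str) -> list[dict]:
    # Restore the chat history from disk.
    return json.loads(Path(path).read_text())

history = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help?"},
]
save_conversation(history, "conversation.json")
restored = load_conversation("conversation.json")
```

A real export path would likely add metadata (provider, model, timestamp), but the round-trip shape is the same.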
Key Capabilities Demonstrated
The screenshot captures the core functionality of Convoscope:
- Multi-Provider Architecture - Seamless switching between OpenAI, Anthropic, and Google Gemini
- Intelligent Fallback - Automatic provider switching when failures occur
- Conversation Persistence - Save and load conversation history
- Professional UI/UX - Clean, modern interface suitable for production use
- Responsive Design - Works on desktop and mobile devices
- Error Resilience - Graceful handling of API failures and edge cases
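The intelligent-fallback behavior listed above can be sketched as trying providers in priority order and moving on when one fails. This is a minimal sketch under assumptions: the function names and the stand-in provider callables are hypothetical, not Convoscope's implementation.

```python
# Hypothetical sketch of provider fallback: try each provider in order,
# fall back to the next on failure. Not Convoscope's actual API.
from typing import Callable

def chat_with_fallback(
    prompt: str,
    providers: list[tuple[str, Callable[[str], str]]],
) -> tuple[str, str]:
    """Try each (name, call) pair in order; return (provider_name, reply)."""
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # real code would catch provider-specific errors
            errors.append(f"{name}: {exc}")
    raise RuntimeError("All providers failed: " + "; ".join(errors))

# Stand-in callables simulating one failing and one healthy provider:
def flaky_provider(prompt: str) -> str:
    raise TimeoutError("rate limited")

def healthy_provider(prompt: str) -> str:
    return f"echo: {prompt}"

name, reply = chat_with_fallback(
    "hi", [("openai", flaky_provider), ("gemini", healthy_provider)]
)
```

In practice the UI would also surface which provider ultimately answered, which is what the provider status indicators convey.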
Additional Screenshots (Coming Soon)
The following screenshots are being updated to reflect the latest interface improvements:
- Provider Selection Dropdown - Detailed view of multi-provider options
- Full Application View - Complete layout with all UI components
- Error Handling - User-friendly error messages and recovery flows
- Mobile Interface - Responsive design on mobile viewports
- Compare View - Side-by-side model comparison interface
- Results Viewer - Experiment logs with filters and export options
See the Visual Assets Index for the complete screenshot roadmap.
Screenshot captured from the live application running on macOS in Chrome.