
Browser

This guide explains how to integrate OpenWebUI with browser extensions and other applications using the OpenWebUI API.

OpenWebUI provides a powerful API that allows you to integrate its capabilities with various browser extensions and applications. This enables you to:

  • Use OpenWebUI’s models through your favorite browser extensions
  • Access document analysis and knowledge search from other applications
  • Create custom integrations for your workflow

Before integrating with browser extensions, you’ll need to:

  1. Get your API key from OpenWebUI:

    • Go to Settings > Account
    • Generate an API key or use your JWT token
  2. Configure your OpenWebUI instance:

    • Ensure your instance is accessible via HTTPS
    • Note your instance’s base URL (e.g., https://ollama.candesic.net)
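As a quick sanity check of both prerequisites, the sketch below calls your instance with the API key. It is a minimal sketch, assuming a GET /api/models endpoint that requires authentication; the base URL and YOUR_API_KEY are placeholders to replace with your own values.

import requests

BASE_URL = "https://ollama.candesic.net"            # your instance's base URL
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # key from Settings > Account

# Assumed endpoint: listing models is a simple way to confirm the key works
resp = requests.get(f"{BASE_URL}/api/models", headers=HEADERS, timeout=30)
resp.raise_for_status()
print("Authenticated successfully; models endpoint returned", resp.status_code)

If this request fails with a 401, revisit the API key; if it fails to connect at all, check the base URL and HTTPS setup before configuring any extension.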

ChatGPTBox is a popular browser extension that can be configured to use OpenWebUI’s API.

  1. Install ChatGPTBox from the Chrome Web Store, Firefox Add-ons, or Microsoft Edge Add-ons

  2. Configure the extension:

    • Open ChatGPTBox settings
    • Select “Custom API” as the API provider
    • Enter your OpenWebUI endpoint: https://ollama.candesic.net/api/chat/completions
    • Add your API key in the Authorization header
    • Select your preferred model from the available options

Once configured, you can:

  • Use the extension’s floating chat interface
  • Access OpenWebUI’s models through the extension
  • Utilize document analysis features
  • Perform knowledge searches

OpenWebUI provides several API endpoints that can be used for integration:

POST /api/chat/completions

Use this endpoint for chat interactions with models.
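A minimal sketch of a non-streaming chat request follows. It assumes the response follows the OpenAI chat-completions shape, and the model name is a placeholder for a model available on your instance.

import requests

BASE_URL = "https://ollama.candesic.net"            # your instance
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # your key

payload = {
    "model": "llama3.1",  # replace with a model available on your instance
    "messages": [{"role": "user", "content": "Your question here"}],
}

resp = requests.post(f"{BASE_URL}/api/chat/completions",
                     headers=HEADERS, json=payload, timeout=60)
resp.raise_for_status()
# Assumes an OpenAI-compatible response body
print(resp.json()["choices"][0]["message"]["content"])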

POST /api/v1/files/

Upload and analyze documents through the API.
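A minimal upload sketch, assuming the endpoint accepts a multipart form field named file and returns a JSON body containing an id for the stored file; verify both against your instance's API docs.

import requests

BASE_URL = "https://ollama.candesic.net"            # your instance
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # your key

# Assumed multipart field name "file"
with open("report.pdf", "rb") as fh:
    resp = requests.post(f"{BASE_URL}/api/v1/files/",
                         headers=HEADERS, files={"file": fh}, timeout=120)
resp.raise_for_status()
file_id = resp.json().get("id")  # assumed response field holding the file ID
print("Uploaded file id:", file_id)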

POST /api/v1/knowledge/{id}/file/add

Add files to knowledge collections for enhanced search capabilities.
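A minimal sketch of attaching an uploaded file to a knowledge collection. It assumes the endpoint takes a JSON body with a file_id field; the collection ID and file ID are placeholders (the file ID is what the upload endpoint returns).

import requests

BASE_URL = "https://ollama.candesic.net"            # your instance
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # your key

knowledge_id = "YOUR_KNOWLEDGE_ID"   # placeholder collection ID
file_id = "YOUR_FILE_ID"             # ID returned by the file upload endpoint

resp = requests.post(
    f"{BASE_URL}/api/v1/knowledge/{knowledge_id}/file/add",
    headers=HEADERS,
    json={"file_id": file_id},  # assumed request body shape
    timeout=60,
)
resp.raise_for_status()
print("File added to knowledge collection")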

Here’s an example of how to configure ChatGPTBox to use OpenWebUI:

  1. API Configuration:

    Base URL: https://ollama.candesic.net
    Endpoint: /api/chat/completions
    Authorization: Bearer YOUR_API_KEY
  2. Request Format:

    {
      "model": "llama3.1",
      "messages": [
        {
          "role": "user",
          "content": "Your question here"
        }
      ]
    }

Keep the following best practices in mind when integrating:

  1. Security:

    • Always use HTTPS for API connections
    • Keep your API keys secure
    • Regularly rotate your API keys
  2. Performance:

    • Cache responses when appropriate
    • Use streaming for long responses (see the sketch after this list)
    • Monitor API usage and rate limits
  3. Error Handling:

    • Implement proper error handling
    • Provide fallback options
    • Log errors for debugging
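A minimal sketch of streaming with basic error handling. It assumes the chat endpoint accepts an OpenAI-style "stream": true flag and returns server-sent events prefixed with "data: "; verify the chunk format against your instance before relying on it.

import json
import requests

BASE_URL = "https://ollama.candesic.net"            # your instance
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # your key

payload = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "Write a short poem."}],
    "stream": True,  # assumed OpenAI-style streaming flag
}

try:
    with requests.post(f"{BASE_URL}/api/chat/completions",
                       headers=HEADERS, json=payload, stream=True, timeout=120) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line or not line.startswith(b"data: "):
                continue
            chunk = line[len(b"data: "):]
            if chunk == b"[DONE]":
                break
            # Assumes OpenAI-style delta chunks
            delta = json.loads(chunk)["choices"][0]["delta"].get("content", "")
            print(delta, end="", flush=True)
except requests.RequestException as exc:
    # Log the failure and fall back (e.g., retry or show a cached response)
    print(f"Request failed: {exc}")

Streaming keeps the interface responsive for long answers, and catching requests.RequestException in one place makes it easy to log errors and trigger a fallback.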

Common issues and solutions:

  1. Authentication Errors:

    • Verify your API key is correct
    • Check if the key has proper permissions
    • Ensure the Authorization header is properly formatted
  2. Connection Issues:

    • Verify your OpenWebUI instance is accessible
    • Check network connectivity
    • Ensure CORS is properly configured
  3. Model Selection:

    • Verify the model name is correct
    • Check if the model is available on your instance
    • Ensure you have proper access to the model