# LLM Management
## Introduction
This documentation provides guidance on the LLM Management dashboard, which allows super administrators to add, configure, and manage language models from various providers. The dashboard is the central place for customizing which AI models are available within the platform.
## Accessing LLM Management

### Navigation Path
- Log in to the platform with super admin credentials
- Locate the left navigation bar
- Select Global Settings from the navigation options
- From the displayed options, click on LLM Management
### Access Restrictions
Important: LLM Management is accessible only to users with super admin privileges. Regular admin users or standard users will not see this option in their navigation menu.
## Dashboard Overview
Once you access the LLM Management section, you will see the main dashboard interface with the following key components:
| Component | Description |
|---|---|
| Provider Section | Lists all available LLM providers (e.g., OpenAI, Anthropic) |
| Configuration Controls | Buttons to enable/disable and configure provider settings |
| Model Library Tabs | Tabs to switch between available models and your added models |
| Action Buttons | Buttons for adding models and other management functions |
## Provider Management

### Expanding Provider Options
To view available models from a specific provider:
- Locate the provider in the list (providers are typically organized alphabetically)
- Click on the dropdown arrow next to the provider name
- The dropdown will expand to reveal available models and configuration options
### Adding Models to Library
To add a model from a provider to your model library:
- Expand the dropdown for the desired provider
- Locate and click the Add to Model Library button (appears as a "+" icon with text)
- A dialog box will appear requesting necessary credentials
- Enter the required API key for the selected provider
- Select the appropriate user group that should have access to this model
  - Groups determine which users can access the model once added
- Click the Add button to complete the process
API key format examples:

- OpenAI: `sk-xxxxxxxxxxxxxxxxxxxxx`
- Anthropic: `sk-ant-xxxxxxxxxxxxxxxxxxxxx`
Note: Ensure you're using a valid API key with sufficient permissions and quota for the model you're adding.
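Before pasting a key into the dialog, you may want to confirm that the provider actually accepts it. The sketch below is an illustrative Python check (it is not part of the platform) that calls the public model-listing endpoints of OpenAI and Anthropic using the `requests` library; a 200 response means the key is valid, although it does not confirm remaining quota. The key values shown are placeholders.

```python
# Illustrative pre-check for an API key before adding a model to the library.
# Not part of the platform; it only calls the providers' public model-listing
# endpoints. Replace the placeholder keys with your own.
import requests

def check_openai_key(api_key: str) -> bool:
    """Return True if OpenAI accepts the key (HTTP 200 on the models list)."""
    resp = requests.get(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    return resp.status_code == 200

def check_anthropic_key(api_key: str) -> bool:
    """Return True if Anthropic accepts the key (HTTP 200 on the models list)."""
    resp = requests.get(
        "https://api.anthropic.com/v1/models",
        headers={"x-api-key": api_key, "anthropic-version": "2023-06-01"},
        timeout=10,
    )
    return resp.status_code == 200

if __name__ == "__main__":
    print("OpenAI key accepted:", check_openai_key("sk-xxxxxxxxxxxxxxxxxxxxx"))
    print("Anthropic key accepted:", check_anthropic_key("sk-ant-xxxxxxxxxxxxxxxxxxxxx"))
```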
### Configuring Provider Visibility
As a super admin, you can control which providers are visible to other administrators:
- Locate the Configuration button next to the provider
- Click the button to open provider visibility settings
- Toggle the visibility option to enable/disable the provider
- Confirm your changes
When a provider is disabled:
- It will not be visible to other admin users
- Models from this provider will not appear in selection options throughout the platform
- Existing implementations that use these models will need to fall back to an alternative model (see the sketch below)
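Because a disabled provider's models disappear from selection options, downstream code should not hard-code a single model. The following is a minimal illustrative fallback pattern; the model identifiers and the `available_models` list are hypothetical examples, not values defined by the platform.

```python
# Hypothetical fallback pattern: prefer the first configured model that is
# still enabled on the platform, rather than hard-coding a single provider.
PREFERRED_MODELS = ["gpt-4o", "claude-sonnet-4-20250514"]  # example identifiers

def pick_model(available_models: list[str]) -> str:
    """Return the first preferred model that is still available."""
    for model in PREFERRED_MODELS:
        if model in available_models:
            return model
    raise RuntimeError(
        "No preferred model is available; ask a super admin to enable a provider."
    )

# Example: the OpenAI provider was disabled, so only the Anthropic model remains.
print(pick_model(["claude-sonnet-4-20250514"]))  # -> claude-sonnet-4-20250514
```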
## Model Library Management

### Viewing Added Models
To view all models that have been added to your platform:
- Click on the My Model Library tab at the top of the LLM Management dashboard
- This view displays all models that have been successfully added to your platform
The My Model Library view provides:
- A comprehensive list of all added models
- Details about each model including provider and configuration status
- Options to manage existing models
### Foundation Models
The My Model Library view includes both:
- Custom Added Models: Models you've added with your own API keys
- Foundation Models: Pre-configured models available to all administrators
Foundation models are:
- Pre-integrated into the platform
- Available without requiring additional API keys
- Managed by the platform provider
- Subject to platform licensing restrictions
## Permissions and Access Control
LLM Management provides granular control over which user groups can access specific models:
| Access Level | Description |
|---|---|
| Super Admin | Can add, configure, and manage all aspects of LLM providers and models |
| Group-Based Access | Models can be restricted to specific user groups during the addition process (illustrated below) |
| Visibility Control | Entire providers can be hidden or shown to different administrative users |
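The sketch below illustrates group-based access as a simple mapping from groups to permitted models. The group names, model names, and data structure are hypothetical; the platform stores and enforces the real mapping internally.

```python
# Conceptual illustration of group-based model access. The groups, models,
# and mapping below are hypothetical; the platform manages the real mapping.
MODEL_ACCESS: dict[str, set[str]] = {
    "engineering": {"gpt-4o", "claude-sonnet-4-20250514"},
    "support": {"gpt-4o-mini"},
}

def user_can_use_model(user_groups: list[str], model: str) -> bool:
    """A user may use a model if any of their groups was granted access to it."""
    return any(model in MODEL_ACCESS.get(group, set()) for group in user_groups)

print(user_can_use_model(["support"], "gpt-4o"))      # False
print(user_can_use_model(["engineering"], "gpt-4o"))  # True
```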
## Troubleshooting

### Common Issues
| Issue | Possible Solution |
|---|---|
| API Key Invalid | Verify the API key format and ensure it has not expired (see the check script below) |
| Model Not Appearing in Library | Refresh the page or check whether the model was added to the correct group |
| Provider Configuration Failed | Ensure your provider account has access to the specific model |
| Access Denied | Verify you have the super admin privileges required for LLM Management |
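For the "API Key Invalid" and "Provider Configuration Failed" rows, it can help to confirm outside the dashboard whether the key is rejected outright or simply lacks access to the model you are adding. The sketch below uses OpenAI's public model-listing endpoint via the `requests` library; the key and model ID are placeholders, and the same idea applies to other providers.

```python
# Distinguish an invalid key (HTTP 401) from a key that lacks access to a
# specific model, using OpenAI's public model-listing endpoint. Illustrative
# only; replace the placeholder key and model ID with your own.
import requests

def key_has_model(api_key: str, model_id: str) -> bool:
    resp = requests.get(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    if resp.status_code == 401:
        raise ValueError("API key was rejected by the provider")
    resp.raise_for_status()
    visible_models = {m["id"] for m in resp.json()["data"]}
    return model_id in visible_models

print(key_has_model("sk-xxxxxxxxxxxxxxxxxxxxx", "gpt-4o"))
```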
### Support Resources
If you encounter persistent issues with the LLM Management dashboard:
- Check the platform documentation for updated instructions
- Contact your platform provider's support team
- Verify your license includes access to custom LLM configuration
## Next Steps
After configuring your LLM Management settings:
- Test the added models in your applications
- Configure model-specific parameters in your implementation
- Monitor usage and performance metrics (see the logging sketch below)
- Adjust configurations as needed based on performance and cost
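As a starting point for the monitoring step, the sketch below wraps a model call with basic latency and token logging. The `call_model` function is a hypothetical stand-in for whatever client your implementation uses, and the logged fields are examples; replace them with your actual SDK call and metrics.

```python
# Minimal latency/usage logging around a model call. call_model is a
# hypothetical placeholder; swap in the SDK call your implementation uses.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("llm-usage")

def call_model(model: str, prompt: str) -> dict:
    """Hypothetical placeholder that stands in for a real provider SDK call."""
    return {"text": "example response", "total_tokens": 42}

def timed_call(model: str, prompt: str) -> dict:
    start = time.perf_counter()
    result = call_model(model, prompt)
    elapsed_ms = (time.perf_counter() - start) * 1000
    log.info(
        "model=%s latency_ms=%.1f tokens=%s",
        model, elapsed_ms, result.get("total_tokens"),
    )
    return result

timed_call("gpt-4o", "Hello")
```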
This documentation is intended for super administrators responsible for configuring and managing language models within the platform. For standard user guides or implementation documentation, please refer to the relevant sections of the platform documentation.