## Description
Nous Hermes 2 Mixtral 8x7B DPO is the new flagship Nous Research model, trained over the [Mixtral 8x7B MoE LLM](/models/mistralai/mixtral-8x7b). The model was trained on over 1,000,000 entries of primarily [GPT-4](/models/openai/gpt-4)-generated data, along with other high-quality data from open datasets across the AI landscape, and achieves state-of-the-art performance on a variety of tasks. #moe
## API Usage Examples

### OpenAI-Compatible Endpoint

Use this endpoint with any OpenAI-compatible library. Model: Nous: Hermes 2 Mixtral 8x7B DPO (`nousresearch/nous-hermes-2-mixtral-8x7b-dpo`)
```bash
curl https://api.ridvay.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "nousresearch/nous-hermes-2-mixtral-8x7b-dpo",
    "messages": [
      {
        "role": "user",
        "content": "Explain the capabilities of the Nous: Hermes 2 Mixtral 8x7B DPO model"
      }
    ],
    "temperature": 0.7,
    "max_tokens": 1024
  }'
```
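The same request can be sent from Python. A minimal sketch using only the standard library; the endpoint URL, model ID, and request body mirror the curl example above, and `YOUR_API_KEY` remains a placeholder:

```python
import json
import urllib.request

API_URL = "https://api.ridvay.com/v1/chat/completions"
API_KEY = "YOUR_API_KEY"  # placeholder: substitute a real key

# Build the same request body as the curl example above.
payload = {
    "model": "nousresearch/nous-hermes-2-mixtral-8x7b-dpo",
    "messages": [
        {
            "role": "user",
            "content": "Explain the capabilities of the Nous: Hermes 2 Mixtral 8x7B DPO model",
        }
    ],
    "temperature": 0.7,
    "max_tokens": 1024,
}

def chat(body: dict) -> dict:
    """POST the request body and return the parsed JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# With a valid key:
# reply = chat(payload)
# print(reply["choices"][0]["message"]["content"])
```

Because the endpoint is OpenAI-compatible, the official `openai` SDK should also work by pointing its `base_url` at `https://api.ridvay.com/v1`.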
## Supported Modalities
- Text
## API Pricing
- Input: $0.60 / 1M tokens
- Output: $0.60 / 1M tokens
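Since input and output are billed at the same per-token rate, a request's cost is easy to estimate. A small sketch using the rates above (the token counts in the usage comment are hypothetical):

```python
# USD per token, from the pricing table: $0.60 per 1M tokens each way.
INPUT_RATE = 0.6 / 1_000_000
OUTPUT_RATE = 0.6 / 1_000_000

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# e.g. a 30,000-token prompt with a 1,024-token completion:
# estimate_cost(30_000, 1_024) → ~$0.0186
```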
## Token Limits
- Max Output: 2,048 tokens
- Max Context: 32,768 tokens
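These limits mean `max_tokens` cannot exceed 2,048, and (assuming the 32,768-token context window must hold both prompt and completion, which is the usual convention) the prompt also constrains how long the completion can be. A hypothetical helper that clamps a requested completion length to both limits:

```python
MAX_OUTPUT = 2_048    # maximum completion length, from the limits above
MAX_CONTEXT = 32_768  # maximum context window, from the limits above

def clamp_max_tokens(prompt_tokens: int, requested: int) -> int:
    """Clamp a requested completion length to the model's token limits."""
    if prompt_tokens >= MAX_CONTEXT:
        raise ValueError(
            f"prompt ({prompt_tokens} tokens) exceeds the {MAX_CONTEXT}-token context"
        )
    room = MAX_CONTEXT - prompt_tokens  # tokens left in the context window
    return min(requested, MAX_OUTPUT, room)

# clamp_max_tokens(1_000, 4_096)  → 2048 (capped by max output)
# clamp_max_tokens(32_000, 2_048) → 768  (capped by remaining context)
```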
## Subscription Tiers
- free
- pro
- ultimate