Dynamic AI Model Manager
The AI Agent SDK supports multiple AI model providers. The following `llm_config` provider options are currently available:
- OpenAI
- Anthropic
- Mistral
- Fireworks
- GoogleGenAI
- Grok
- Groq
- VertexAI
- AWS Bedrock
Provider Enum
Here is an example of accessing the Crypto.com AI Agent SDK `Provider` enums:
```python
from crypto_com_agent_client.lib.enums.provider_enum import Provider

for provider in Provider:
    print(provider.name, provider.value)

# Output:
# OpenAI OpenAI
# Anthropic Anthropic
# Mistral Mistral
# Fireworks Fireworks
# GoogleGenAI GoogleGenAI
# Grok Grok
# Groq Groq
# VertexAI VertexAI
# Bedrock Bedrock
```

Example: Using Crypto.com AI Agent with Gemini
Here is an example of using the Crypto.com AI Agent with Google's Gemini model for Cronos chain interactions:
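A minimal sketch of a Gemini-backed `llm_config` is shown below. The dictionary key names (`provider`, `model`, `provider-api-key`) and the model identifier are illustrative assumptions, not confirmed by this page; consult the SDK reference for the exact schema.

```python
# Sketch of an llm_config for Google's Gemini (key names are assumptions).
llm_config = {
    # "GoogleGenAI" matches Provider.GoogleGenAI.value from the enum above.
    "provider": "GoogleGenAI",
    # Hypothetical model identifier; substitute the Gemini model you use.
    "model": "gemini-1.5-flash",
    # Placeholder; supply your own Google AI Studio API key.
    "provider-api-key": "YOUR_GEMINI_API_KEY",
}

print(llm_config["provider"])
```

The resulting dictionary would then be passed to the agent at initialization in the same way as any other provider's `llm_config`.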