Dynamic AI Model Manager

The Crypto.com AI Agent SDK supports multiple AI model providers. The currently available llm_config provider options are listed below (a minimal configuration sketch follows the list):

  • OpenAI

  • Anthropic

  • Mistral

  • Fireworks

  • GoogleGenAI

  • Grok

  • Groq

  • VertexAI
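
Switching providers only requires pointing the provider field at a different enum member and supplying a matching model name and API key. Below is a minimal llm_config sketch; all values are placeholders, and the field names mirror the full Gemini example later on this page.

from crypto_com_agent_client.lib.enums.provider_enum import Provider

# Illustrative llm_config for a different provider; all values are placeholders.
llm_config = {
    "provider": Provider.Anthropic,      # any member of the Provider enum
    "model": "your-model-name",          # model offered by the chosen provider
    "provider-api-key": "your-api-key",  # API key for that provider
    "temperature": 0.7,                  # float controlling output randomness
}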

Provider Enum

Here is an example of how to list the Crypto.com AI Agent SDK Provider enum members:

from crypto_com_agent_client.lib.enums.provider_enum import Provider

# Print the name and value of every supported provider
for provider in Provider:
    print(provider.name, provider.value)
    
# Output:
# OpenAI OpenAI
# Anthropic Anthropic
# Mistral Mistral
# Fireworks Fireworks
# GoogleGenAI GoogleGenAI
# Grok Grok
# Groq Groq
# VertexAI VertexAI
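
Because Provider is a standard Python Enum, a provider name stored in configuration (for example, an environment variable) can be resolved to an enum member by name. A minimal sketch, assuming only standard enum lookup semantics; the LLM_PROVIDER variable name is illustrative:

import os
from crypto_com_agent_client.lib.enums.provider_enum import Provider

# Resolve a provider from a plain string, e.g. LLM_PROVIDER=GoogleGenAI.
provider_name = os.getenv("LLM_PROVIDER", "OpenAI")
provider = Provider[provider_name]  # raises KeyError for unknown names

print(provider)  # Provider.GoogleGenAI when LLM_PROVIDER=GoogleGenAI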

Example: Using Crypto.com AI Agent with Gemini

Here is an example of using the Crypto.com AI Agent with Google's Gemini model for Cronos chain interactions:

from crypto_com_agent_client import Agent, SQLitePlugin
from crypto_com_agent_client.lib.enums.provider_enum import Provider

# Initialize the agent
agent = Agent.init(
    llm_config={
        "provider": Provider.GoogleGenAI,
        "model": "gemini-2.0-flash",
        "provider-api-key": "your-api-key",
        "temperature": 0.7,  # float controlling output randomness
    },
    blockchain_config={
        "chainId": "chain-id",
        "explorer-api-key": "your-api-key",
        "private-key": "your-private-key",
        "sso-wallet-url": "your-sso-wallet-url",
    },
    plugins={
        "storage": SQLitePlugin(db_path="agent_state.db"),
        "instructions": "You are an assistant that always provides useful responses.",
    },
)


# Interaction
response = agent.interact("Get latest block height")
print(response)
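
The same agent instance can handle follow-up prompts. A simple interactive loop is sketched below; it only assumes that agent.interact accepts a string and returns a printable response, as shown above.

# Simple REPL around the initialized agent (sketch).
while True:
    user_input = input("You: ")
    if user_input.strip().lower() in {"exit", "quit"}:
        break
    print("Agent:", agent.interact(user_input))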
