LLM Model Usage

Ollama provider

The default Ollama server address is http://127.0.0.1:11434, and no API key is required.
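Because Ollama runs locally without authentication, a chat request is just a POST to its /api/chat endpoint. A minimal sketch of how such a request is assembled; `build_ollama_request` is a hypothetical helper and "llama3" is an example model name, neither is part of this project:

```python
# Sketch: assemble a request for Ollama's local /api/chat endpoint.
# No API key is needed, so there is no auth header at all.

def build_ollama_request(server: str, model: str, prompt: str):
    """Return (endpoint, body) for an Ollama chat call."""
    endpoint = server.rstrip("/") + "/api/chat"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single JSON response instead of a stream
    }
    return endpoint, body

endpoint, body = build_ollama_request("http://127.0.0.1:11434", "llama3", "Hi")
```

The returned endpoint and body could then be sent with any HTTP client.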


OpenAI provider

You can use the OpenAI provider to access all OpenAI models, such as gpt-4, gpt-4o, gpt-4o-mini, etc. The default OpenAI server address is https://api.openai.com, and an API key is required. When you use another proxy or relay provider, change the API URL accordingly.
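The proxy case above only changes the base URL; the rest of the request stays the same. A minimal sketch of how an OpenAI-style chat request is assembled; `build_request` is a hypothetical helper for illustration, not an SDK function:

```python
# Sketch: assemble a chat-completions request for an OpenAI-style endpoint.
# Swapping api_url for a proxy address relays traffic without other changes.

def build_request(api_url: str, api_key: str, model: str, prompt: str):
    """Return (endpoint, headers, body) for a chat completion call."""
    endpoint = api_url.rstrip("/") + "/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # OpenAI uses Bearer auth
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return endpoint, headers, body

endpoint, headers, body = build_request(
    "https://api.openai.com", "sk-...", "gpt-4o-mini", "Hello!"
)
```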


Anthropic provider

You can use the Anthropic provider to access all Anthropic models, such as the Claude family (e.g. Claude 3.5 Sonnet). The default Anthropic server address is https://api.anthropic.com, and an API key is required. When you use another proxy or relay provider, change the API URL accordingly.
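Anthropic's Messages API authenticates differently from OpenAI's: it expects an `x-api-key` header plus an `anthropic-version` header rather than a Bearer token. A minimal sketch; `build_anthropic_request` is a hypothetical helper for illustration:

```python
# Sketch: assemble a request for Anthropic's /v1/messages endpoint.

def build_anthropic_request(api_url: str, api_key: str, model: str, prompt: str):
    """Return (endpoint, headers, body) for an Anthropic Messages call."""
    endpoint = api_url.rstrip("/") + "/v1/messages"
    headers = {
        "x-api-key": api_key,               # not a Bearer token
        "anthropic-version": "2023-06-01",  # required API version header
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "max_tokens": 1024,  # the Messages API requires max_tokens
        "messages": [{"role": "user", "content": prompt}],
    }
    return endpoint, headers, body
```

A proxy provider is handled the same way as with OpenAI: pass its address as `api_url`.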


SiliconFlow provider

You can use the SiliconFlow provider to access all SiliconFlow models, such as Qwen/Qwen2.5-7B-Instruct and THUDM/glm-4-9b-chat. The default SiliconFlow server address is https://api.siliconflow.ai, and an API key is required. Access goes through the official OpenAI Python SDK, since the API is OpenAI-compatible.
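Because SiliconFlow speaks the OpenAI protocol, only the base URL and key differ when constructing the SDK client. A minimal sketch; `provider_kwargs` is a hypothetical helper, and the commented client calls assume the `openai` package is installed:

```python
# Sketch: an OpenAI-compatible provider only needs a different base_url
# and api_key when creating the OpenAI SDK client.

def provider_kwargs(api_url: str, api_key: str) -> dict:
    """Keyword arguments you would pass to openai.OpenAI(...)."""
    return {"base_url": api_url.rstrip("/") + "/v1", "api_key": api_key}

kwargs = provider_kwargs("https://api.siliconflow.ai", "sk-...")
# client = openai.OpenAI(**kwargs)
# client.chat.completions.create(model="Qwen/Qwen2.5-7B-Instruct", ...)
```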


OpenAI-API-compatible provider

Any other OpenAI-API-compatible endpoint is also supported, accessed through the official OpenAI Python SDK. You need to set an API key, an API URL, and a custom name for the provider.
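The three settings above can be pictured as one provider entry. A minimal sketch under assumptions: the field names (`name`, `api_url`, `api_key`) and the example gateway address are illustrative, not this project's actual configuration schema:

```python
# Sketch: a custom OpenAI-API-compatible provider entry. All field names
# and values here are illustrative assumptions.

custom_provider = {
    "name": "my-local-gateway",          # custom display name, your choice
    "api_url": "http://localhost:8000",  # any OpenAI-compatible endpoint
    "api_key": "sk-anything",            # whatever the endpoint expects
}

# The same OpenAI SDK client code then works unchanged, e.g.:
# client = openai.OpenAI(base_url=custom_provider["api_url"] + "/v1",
#                        api_key=custom_provider["api_key"])
```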
