Interface defining the input to the ChatZhipuAI class.

interface ChatZhipuAIParams {
    modelName: ModelName;
    doSample?: boolean;
    maxTokens?: number;
    messages?: ZhipuMessage[];
    requestId?: string;
    stop?: string[];
    streaming?: boolean;
    temperature?: number;
    topP?: number;
    zhipuAIApiKey?: string;
}
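
A minimal usage sketch (it assumes the ChatZhipuAI class exported from @langchain/community/chat_models/zhipuai accepts these fields in its constructor; verify the import path and field names against the installed version):

import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";
import { HumanMessage } from "@langchain/core/messages";

// Field names follow ChatZhipuAIParams above; the values shown are the documented defaults.
const model = new ChatZhipuAI({
    modelName: "glm-3-turbo",
    temperature: 0.95,
    topP: 0.7,
    maxTokens: 1024,
    zhipuAIApiKey: process.env.ZHIPUAI_API_KEY, // optional, read from this env var by default
});

const response = await model.invoke([new HumanMessage("Hello!")]);
console.log(response.content);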

Properties

modelName: ModelName

Default

"glm-3-turbo"
doSample?: boolean

Turns on the sampling strategy when set to true; when doSample is false, temperature and topP will not take effect.
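
An illustrative sketch of this interaction (ChatZhipuAI and its constructor are assumptions, as in the example above):

import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";

// With doSample false, temperature and topP are expected to be ignored by the API,
// so these two configurations should behave the same.
const greedy = new ChatZhipuAI({ modelName: "glm-3-turbo", doSample: false, temperature: 0.1 });
const alsoGreedy = new ChatZhipuAI({ modelName: "glm-3-turbo", doSample: false, temperature: 0.9 });

// Sampling parameters only take effect when doSample is true.
const sampled = new ChatZhipuAI({ modelName: "glm-3-turbo", doSample: true, temperature: 0.9, topP: 0.7 });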

maxTokens?: number

The maximum value is 8192; defaults to 1024.

messages?: ZhipuMessage[]

Messages to pass as a prefix to the prompt.

requestId?: string

Unique identifier for the request. Defaults to a random UUID.

stop?: string[]

streaming?: boolean

Whether to stream the results or not. Defaults to false.
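
A streaming sketch (it assumes the standard LangChain stream() method is available on ChatZhipuAI, as on other chat models):

import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";

const streamingModel = new ChatZhipuAI({ modelName: "glm-3-turbo", streaming: true });

// Chunks arrive incrementally instead of as one final message.
const stream = await streamingModel.stream("Write a short poem about the sea.");
for await (const chunk of stream) {
    process.stdout.write(String(chunk.content));
}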

temperature?: number

Amount of randomness injected into the response. Ranges from 0 to 1 (0 is not included). Use a temperature closer to 0 for analytical / multiple-choice tasks, and closer to 1 for creative and generative tasks. Defaults to 0.95.

topP?: number

Total probability mass of tokens to consider at each step. Ranges from 0 to 1. Defaults to 0.7.
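
A sketch of how these two sampling parameters might be tuned (illustrative values only; ChatZhipuAI assumed as above):

import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";

// Closer to 0 for analytical / multiple-choice style tasks.
const analytical = new ChatZhipuAI({ modelName: "glm-3-turbo", temperature: 0.1, topP: 0.5 });

// Closer to 1 for creative and generative tasks (0.95 / 0.7 are the documented defaults).
const creative = new ChatZhipuAI({ modelName: "glm-3-turbo", temperature: 0.95, topP: 0.7 });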

zhipuAIApiKey?: string

API key to use when making requests. Defaults to the value of ZHIPUAI_API_KEY environment variable.
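
A sketch of the two ways to supply the key (ChatZhipuAI assumed as above):

import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";

// Option 1: omit the field and rely on the ZHIPUAI_API_KEY environment variable.
const fromEnv = new ChatZhipuAI({ modelName: "glm-3-turbo" });

// Option 2: pass the key explicitly (read here from a hypothetical MY_ZHIPUAI_KEY variable).
const explicit = new ChatZhipuAI({
    modelName: "glm-3-turbo",
    zhipuAIApiKey: process.env.MY_ZHIPUAI_KEY,
});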
