Type Alias: OpenAIAdditionalChatOptions

OpenAIAdditionalChatOptions: Omit<Partial<OpenAILLM.Chat.ChatCompletionCreateParams>, "max_tokens" | "messages" | "model" | "temperature" | "top_p" | "stream" | "tools" | "toolChoice">

Defined in

packages/llamaindex/src/llm/openai.ts:147
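This alias covers the OpenAI chat-completion parameters that are not already managed by the LLM wrapper (which handles model, messages, temperature, top_p, max_tokens, streaming, and tool configuration itself), so fields such as frequency_penalty, logit_bias, or user can be passed straight through. The sketch below assumes the OpenAI class exported by llamaindex accepts these extra options via an additionalChatOptions constructor field; the specific parameter values are illustrative only.

```ts
import { OpenAI } from "llamaindex";

// Minimal sketch: core parameters (model, temperature, ...) are set on the
// wrapper directly, while remaining ChatCompletionCreateParams fields are
// forwarded through additionalChatOptions.
const llm = new OpenAI({
  model: "gpt-4o-mini",
  temperature: 0.2,
  additionalChatOptions: {
    frequency_penalty: 0.5, // passes through, since it is not in the Omit list
    user: "docs-example",
  },
});

const response = await llm.chat({
  messages: [{ role: "user", content: "Say hello." }],
});
console.log(response.message.content);
```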