Class: ContextChatEngine
ContextChatEngine uses the Index to get the appropriate context for each query. The context is stored in the system prompt, and the chat history is preserved, ideally allowing the appropriate context to be surfaced for each query.
Extends
Implements
Constructors
new ContextChatEngine()
new ContextChatEngine(init): ContextChatEngine
Parameters
• init
• init.chatHistory?: ChatMessage[]
• init.chatModel?: LLM<object, object>
• init.contextRole?: MessageType
• init.contextSystemPrompt?
• init.nodePostprocessors?: BaseNodePostprocessor[]
• init.retriever: BaseRetriever
• init.systemPrompt?: string
Returns
ContextChatEngine
Overrides
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:39
Properties
chatHistory
chatHistory: ChatHistory<object>
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:35
chatModel
chatModel: LLM<object, object>
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:34
contextGenerator
contextGenerator: ContextGenerator
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:36
systemPrompt?
optional systemPrompt: string
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:37
Methods
_getPromptModules()
protected
_getPromptModules(): Record<string, ContextGenerator>
Returns
Record<string, ContextGenerator>
Overrides
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:60
_getPrompts()
protected
_getPrompts(): PromptsDict
Returns
PromptsDict
Inherited from
Defined in
packages/llamaindex/src/prompts/Mixin.ts:79
_updatePrompts()
protected
_updatePrompts(promptsDict): void
Parameters
• promptsDict: PromptsDict
Returns
void
Inherited from
Defined in
packages/llamaindex/src/prompts/Mixin.ts:87
chat()
chat(params)
chat(params): Promise<AsyncIterable<EngineResponse>>
Sends a message, along with the class's current chat history, to the LLM.
Parameters
• params: ChatEngineParamsStreaming
Returns
Promise<AsyncIterable<EngineResponse>>
Implementation of
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:66
chat(params)
chat(params): Promise<EngineResponse>
Sends a message, along with the class's current chat history, to the LLM.
Parameters
• params: ChatEngineParamsNonStreaming
Returns
Promise<EngineResponse>
Implementation of
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:69
getPrompts()
getPrompts(): PromptsDict
Returns all prompts from the mixin and its modules
Returns
PromptsDict
Inherited from
Defined in
packages/llamaindex/src/prompts/Mixin.ts:27
reset()
reset(): void
Resets the chat history so that it's empty.
Returns
void
Implementation of
Defined in
packages/llamaindex/src/engines/chat/ContextChatEngine.ts:106
updatePrompts()
updatePrompts(promptsDict): void
Updates the prompts in the mixin and its modules
Parameters
• promptsDict: PromptsDict
Returns
void
Inherited from
Defined in
packages/llamaindex/src/prompts/Mixin.ts:48
validatePrompts()
validatePrompts(promptsDict, moduleDict): void
Validates the prompt keys and module keys
Parameters
• promptsDict: PromptsDict
• moduleDict: ModuleDict
Returns
void