# Class: OllamaEmbedding

OllamaEmbedding is an alias for Ollama that implements the BaseEmbedding interface.
## Hierarchy

- `Ollama`

  ↳ **`OllamaEmbedding`**

## Implements

- `BaseEmbedding`
## Constructors

### constructor

• **new OllamaEmbedding**(`params`): `OllamaEmbedding`

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `OllamaParams` |

#### Returns

`OllamaEmbedding`

#### Inherited from

Ollama.constructor

#### Defined in

packages/core/src/llm/ollama.ts:57
## Properties

### embedBatchSize

• **embedBatchSize**: `number` = `DEFAULT_EMBED_BATCH_SIZE`

#### Implementation of

BaseEmbedding.embedBatchSize

#### Inherited from

Ollama.embedBatchSize

#### Defined in

packages/core/src/embeddings/types.ts:11
### hasStreaming

• `Readonly` **hasStreaming**: `true`

#### Inherited from

Ollama.hasStreaming

#### Defined in

packages/core/src/llm/ollama.ts:45
### model

• **model**: `string`

#### Inherited from

Ollama.model

#### Defined in

packages/core/src/llm/ollama.ts:48
### options

• **options**: `Partial`<`Omit`<`Options`, `"temperature"` \| `"top_p"` \| `"num_ctx"`>> & `Pick`<`Options`, `"temperature"` \| `"top_p"` \| `"num_ctx"`>

#### Inherited from

Ollama.options

#### Defined in

packages/core/src/llm/ollama.ts:50
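The `options` type splits the underlying `Options` interface in two: `temperature`, `top_p`, and `num_ctx` are picked out as required, while every other field stays optional. A pure-TypeScript sketch of how that intersection behaves (the stub `Options` interface below is illustrative, not the full client type):

```typescript
// Illustrative stand-in for the ollama client's Options interface;
// only a few fields are shown.
interface Options {
  temperature: number;
  top_p: number;
  num_ctx: number;
  num_predict: number; // stand-in for "any other Options field"
}

// Same shape as the `options` property above: the three sampling
// controls are required, everything else is optional.
type OllamaOptions = Partial<Omit<Options, "temperature" | "top_p" | "num_ctx">> &
  Pick<Options, "temperature" | "top_p" | "num_ctx">;

// Compiles without num_predict; omitting temperature would be a type error.
const opts: OllamaOptions = { temperature: 0.7, top_p: 0.9, num_ctx: 4096 };
console.log(opts.temperature);
```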
## Accessors

### metadata

• `get` **metadata**(): `LLMMetadata`

#### Returns

`LLMMetadata`

#### Inherited from

Ollama.metadata

#### Defined in

packages/core/src/llm/ollama.ts:68
## Methods

### chat

▸ **chat**(`params`): `Promise`<`AsyncIterable`<`ChatResponseChunk`>>

Get a chat response from the LLM.

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMChatParamsStreaming`<`object`, `object`> |

#### Returns

`Promise`<`AsyncIterable`<`ChatResponseChunk`>>

#### Inherited from

Ollama.chat

#### Defined in

packages/core/src/llm/ollama.ts:80
▸ **chat**(`params`): `Promise`<`ChatResponse`<`object`>>

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMChatParamsNonStreaming`<`object`, `object`> |

#### Returns

`Promise`<`ChatResponse`<`object`>>

#### Inherited from

Ollama.chat

#### Defined in

packages/core/src/llm/ollama.ts:83
### complete

▸ **complete**(`params`): `Promise`<`AsyncIterable`<`CompletionResponse`>>

Get a prompt completion from the LLM.

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMCompletionParamsStreaming` |