Interface Langchain4jOllamaConfig.OllamaConfig
- Enclosing interface:
- Langchain4jOllamaConfig
public static interface Langchain4jOllamaConfig.OllamaConfig
Method Summary
- baseUrl(): Base URL where the Ollama serving is running
- chatModel(): Chat model related settings
- embeddingModel(): Embedding model related settings
- enableIntegration(): Whether or not to enable the integration.
- logRequests(): Whether the Ollama client should log requests
- logResponses(): Whether the Ollama client should log responses
- timeout(): Timeout for Ollama calls
Method Details
baseUrl
Base URL where the Ollama server is running

timeout
Timeout for Ollama calls

logRequests
Whether the Ollama client should log requests

logResponses
Whether the Ollama client should log responses
enableIntegration
Whether or not to enable the integration. Defaults to true, which means requests are made to the Ollama provider. Set to false to disable all requests.
chatModel
ChatModelConfig chatModel()
Chat model related settings

embeddingModel
EmbeddingModelConfig embeddingModel()
Embedding model related settings
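As a Quarkus config interface, each accessor above maps to a configuration property. A minimal sketch of the corresponding `application.properties` entries, assuming the usual `quarkus.langchain4j.ollama.*` property prefix and kebab-case naming (the values shown are illustrative, not defaults):

```properties
# baseUrl(): Base URL where the Ollama server is running
quarkus.langchain4j.ollama.base-url=http://localhost:11434

# timeout(): Timeout for Ollama calls
quarkus.langchain4j.ollama.timeout=10s

# logRequests() / logResponses(): client request and response logging
quarkus.langchain4j.ollama.log-requests=true
quarkus.langchain4j.ollama.log-responses=true

# enableIntegration(): set to false to disable all requests to Ollama
quarkus.langchain4j.ollama.enable-integration=true
```

The `chatModel()` and `embeddingModel()` sub-configurations are nested the same way, under `quarkus.langchain4j.ollama.chat-model.*` and `quarkus.langchain4j.ollama.embedding-model.*` respectively.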