Interface LangChain4jOllamaConfig.OllamaConfig

Enclosing interface:
LangChain4jOllamaConfig

public static interface LangChain4jOllamaConfig.OllamaConfig
  • Method Details

    • baseUrl

      @WithDefault("http://localhost:11434") String baseUrl()
      Base URL where the Ollama server is running
    • timeout

      @WithDefault("10s") Duration timeout()
      Timeout for Ollama calls
    • logRequests

      @ConfigDocDefault("false") @WithDefault("${quarkus.langchain4j.log-requests}") Optional<Boolean> logRequests()
      Whether the Ollama client should log requests
    • logResponses

      @ConfigDocDefault("false") @WithDefault("${quarkus.langchain4j.log-responses}") Optional<Boolean> logResponses()
      Whether the Ollama client should log responses
    • enableIntegration

      @WithDefault("true") Boolean enableIntegration()
      Whether to enable the integration. Defaults to true, which means requests are made to the Ollama provider. Set to false to disable all requests.
    • chatModel

      ChatModelConfig chatModel()
      Chat model related settings
    • embeddingModel

      EmbeddingModelConfig embeddingModel()
      Embedding model related settings