Interface LangChain4jOpenshiftAiConfig.OpenshiftAiConfig
- Enclosing interface:
- LangChain4jOpenshiftAiConfig
public static interface LangChain4jOpenshiftAiConfig.OpenshiftAiConfig
Method Summary
- baseUrl(): Base URL where OpenShift AI serving is running, such as https://flant5s-l-predictor-ch2023.apps.cluster-hj2qv.dynamic.redhatworkshops.io:443/api
- chatModel(): Chat model related settings
- enableIntegration(): Whether to enable the integration
- logRequests(): Whether the OpenShift AI client should log requests
- logResponses(): Whether the OpenShift AI client should log responses
- timeout(): Timeout for OpenShift AI calls
Method Details
baseUrl
Base URL where OpenShift AI serving is running, such as https://flant5s-l-predictor-ch2023.apps.cluster-hj2qv.dynamic.redhatworkshops.io:443/api
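In a Quarkus application this value is typically supplied through configuration rather than code. As a minimal sketch, assuming the `quarkus.langchain4j.openshift-ai.*` property prefix commonly used by the Quarkus LangChain4j extensions (the prefix is not confirmed by this page):

```properties
# Assumed property name, following Quarkus LangChain4j naming conventions
quarkus.langchain4j.openshift-ai.base-url=https://flant5s-l-predictor-ch2023.apps.cluster-hj2qv.dynamic.redhatworkshops.io:443/api
```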
timeout
@ConfigDocDefault("10s") @WithDefault("${quarkus.langchain4j.timeout}") Optional<Duration> timeout()
Timeout for OpenShift AI calls
logRequests
@ConfigDocDefault("false") @WithDefault("${quarkus.langchain4j.log-requests}") Optional<Boolean> logRequests()
Whether the OpenShift AI client should log requests
logResponses
@ConfigDocDefault("false") @WithDefault("${quarkus.langchain4j.log-responses}") Optional<Boolean> logResponses()
Whether the OpenShift AI client should log responses
enableIntegration
Whether to enable the integration. Defaults to true, which means requests are made to the OpenShift AI provider. Set to false to disable all requests.
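The client settings above can be sketched together in application.properties. The `openshift-ai` prefix is an assumption based on common Quarkus LangChain4j naming, and the global `quarkus.langchain4j.*` keys illustrate the fallback expressed by the `@WithDefault("${quarkus.langchain4j.timeout}")`-style defaults shown above:

```properties
# Global fallbacks, used when no OpenShift AI-specific value is set
quarkus.langchain4j.timeout=10s
quarkus.langchain4j.log-requests=false

# OpenShift AI-specific overrides (prefix assumed for illustration)
quarkus.langchain4j.openshift-ai.timeout=30s
quarkus.langchain4j.openshift-ai.log-requests=true
quarkus.langchain4j.openshift-ai.log-responses=true
# Set to false to disable all requests to the provider
quarkus.langchain4j.openshift-ai.enable-integration=true
```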
chatModel
ChatModelConfig chatModel()
Chat model related settings