Package dev.langchain4j.model.mistralai
Class MistralAiStreamingChatModel
java.lang.Object
dev.langchain4j.model.mistralai.MistralAiStreamingChatModel
All Implemented Interfaces:
dev.langchain4j.model.chat.StreamingChatLanguageModel
public class MistralAiStreamingChatModel
extends Object
implements dev.langchain4j.model.chat.StreamingChatLanguageModel
Represents a Mistral AI Chat Model with a chat completion interface, such as mistral-tiny and mistral-small.
The model's response is streamed token by token and should be handled with a
StreamingResponseHandler.
A description of the parameters can be found in the Mistral AI API documentation.
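A minimal sketch of streaming a chat completion with this model. It assumes the API key is exposed through a `MISTRAL_AI_API_KEY` environment variable (the variable name is an assumption) and uses the `StreamingResponseHandler` callbacks described above; the `CompletableFuture` is only there to keep the example's `main` alive until the stream finishes.

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

import java.util.List;
import java.util.concurrent.CompletableFuture;

public class StreamingExample {

    public static void main(String[] args) throws Exception {
        // The environment variable name is an assumption for this sketch.
        MistralAiStreamingChatModel model =
                MistralAiStreamingChatModel.withApiKey(System.getenv("MISTRAL_AI_API_KEY"));

        CompletableFuture<Response<AiMessage>> done = new CompletableFuture<>();

        model.generate(List.of(UserMessage.from("Tell me a joke about Java")),
                new StreamingResponseHandler<AiMessage>() {

                    @Override
                    public void onNext(String token) {
                        // Called once per streamed token.
                        System.out.print(token);
                    }

                    @Override
                    public void onComplete(Response<AiMessage> response) {
                        done.complete(response);
                    }

                    @Override
                    public void onError(Throwable error) {
                        done.completeExceptionally(error);
                    }
                });

        done.get(); // block until the stream completes
    }
}
```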
Nested Class Summary
Modifier and Type: static class
Constructor Summary
MistralAiStreamingChatModel(String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, String responseFormat, Boolean logRequests, Boolean logResponses, Duration timeout)
    Constructs a MistralAiStreamingChatModel with the specified parameters.
Method Summary
Modifier and TypeMethodDescriptionbuilder()voidgenerate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler) Generates streamed token response based on the given list of messages and tool specification.voidgenerate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler) Generates streamed token response based on the given list of messages.voidgenerate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler) Generates streamed token response based on the given list of messages and tool specifications.static MistralAiStreamingChatModelwithApiKey(String apiKey) Creates a MistralAiStreamingChatModel with the specified API key.Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, waitMethods inherited from interface dev.langchain4j.model.chat.StreamingChatLanguageModel
generate, generate
Constructor Details
MistralAiStreamingChatModel
public MistralAiStreamingChatModel(String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, String responseFormat, Boolean logRequests, Boolean logResponses, Duration timeout)
Constructs a MistralAiStreamingChatModel with the specified parameters.
Parameters:
baseUrl - the base URL of the Mistral AI API; uses the default value if not specified
apiKey - the API key for authentication
modelName - the name of the Mistral AI model to use
temperature - the temperature parameter for generating chat responses
topP - the top-p parameter for generating chat responses
maxTokens - the maximum number of new tokens to generate in a chat response
safePrompt - a flag indicating whether to use a safe prompt for generating chat responses
randomSeed - the random seed for generating chat responses (if not specified, a random number is used)
responseFormat - the response format for generating chat responses; currently supported values are "text" and "json_object"
logRequests - a flag indicating whether to log raw HTTP requests
logResponses - a flag indicating whether to log raw HTTP responses
timeout - the timeout duration for API requests
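Rather than calling the constructor directly, the same parameters can be supplied through the class's builder. This is a sketch only: the builder setters are assumed to mirror the constructor parameters listed above, and the environment variable name is an assumption.

```java
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

import java.time.Duration;

public class BuilderExample {

    public static void main(String[] args) {
        // Builder setter names are assumed to mirror the constructor parameters.
        MistralAiStreamingChatModel model = MistralAiStreamingChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // env var name is an assumption
                .modelName("mistral-small")
                .temperature(0.3)
                .topP(1.0)
                .maxTokens(512)
                .timeout(Duration.ofSeconds(60))
                .logRequests(true)
                .logResponses(true)
                .build();

        // model is now ready for the generate(...) methods described below
    }
}
```

Unset builder parameters fall back to their defaults, so only `apiKey` and `modelName` typically need to be provided.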
Method Details
withApiKey
public static MistralAiStreamingChatModel withApiKey(String apiKey)
Creates a MistralAiStreamingChatModel with the specified API key.
Parameters:
apiKey - the API key for authentication
Returns:
a MistralAiStreamingChatModel instance
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
Generates a streamed token response based on the given list of messages and tool specifications.
Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
Parameters:
messages - the list of chat messages
toolSpecifications - the list of tool specifications; tool_choice is set to AUTO
handler - the response handler for processing the generated chat chunk responses
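A sketch of calling this overload with a tool. The tool name, description, and parameter here are hypothetical, invented for illustration; with tool_choice set to AUTO, the model decides whether to request a tool call, which then surfaces on the final AiMessage.

```java
import dev.langchain4j.agent.tool.JsonSchemaProperty;
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

import java.util.List;

public class ToolStreamingExample {

    public static void main(String[] args) {
        MistralAiStreamingChatModel model =
                MistralAiStreamingChatModel.withApiKey(System.getenv("MISTRAL_AI_API_KEY"));

        // Hypothetical tool: name, description, and parameter are illustrative only.
        ToolSpecification weatherTool = ToolSpecification.builder()
                .name("get_weather")
                .description("Returns the current weather for a city")
                .addParameter("city", JsonSchemaProperty.STRING)
                .build();

        model.generate(
                List.of(UserMessage.from("What is the weather in Paris?")),
                List.of(weatherTool), // tool_choice is AUTO: the model decides whether to call it
                new StreamingResponseHandler<AiMessage>() {

                    @Override
                    public void onNext(String token) {
                        System.out.print(token);
                    }

                    @Override
                    public void onComplete(Response<AiMessage> response) {
                        AiMessage message = response.content();
                        if (message.hasToolExecutionRequests()) {
                            // Inspect message.toolExecutionRequests() to execute the tool
                            // and feed the result back in a follow-up generate(...) call.
                        }
                    }

                    @Override
                    public void onError(Throwable error) {
                        error.printStackTrace();
                    }
                });
    }
}
```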
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
Generates a streamed token response based on the given list of messages and tool specification.
Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
Parameters:
messages - the list of chat messages
toolSpecification - the tool specification; tool_choice is set to ANY
handler - the response handler for processing the generated chat chunk responses
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
Generates a streamed token response based on the given list of messages.
Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
Parameters:
messages - the list of chat messages
handler - the response handler for processing the generated chat chunk responses
builder