Class MistralAiStreamingChatModel

java.lang.Object
dev.langchain4j.model.mistralai.MistralAiStreamingChatModel
All Implemented Interfaces:
dev.langchain4j.model.chat.StreamingChatLanguageModel

public class MistralAiStreamingChatModel extends Object implements dev.langchain4j.model.chat.StreamingChatLanguageModel
Represents a Mistral AI chat model with a chat completion interface, such as mistral-tiny and mistral-small. The model's response is streamed token by token and should be handled with a StreamingResponseHandler. A description of the parameters can be found in the Mistral AI API documentation.
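A minimal usage sketch, combining the withApiKey(String) factory and the generate(List, StreamingResponseHandler) method documented below. The MISTRAL_AI_API_KEY environment variable name and the surrounding main class are illustrative, not part of this API:

```java
import java.util.List;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

public class StreamingExample {

    public static void main(String[] args) {
        // Create the model from an API key; all other parameters use defaults
        MistralAiStreamingChatModel model =
                MistralAiStreamingChatModel.withApiKey(System.getenv("MISTRAL_AI_API_KEY"));

        model.generate(
                List.of(UserMessage.from("Tell me a joke about Java")),
                new StreamingResponseHandler<AiMessage>() {

                    @Override
                    public void onNext(String token) {
                        // Called for each streamed token as it arrives
                        System.out.print(token);
                    }

                    @Override
                    public void onComplete(Response<AiMessage> response) {
                        System.out.println("\nFinished: " + response.finishReason());
                    }

                    @Override
                    public void onError(Throwable error) {
                        error.printStackTrace();
                    }
                });
    }
}
```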
  • Nested Class Summary

    Nested Classes
    Modifier and Type
    Class
    Description
    static class 
     
  • Constructor Summary

    Constructors
    Constructor
    Description
    MistralAiStreamingChatModel(String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, String responseFormat, Boolean logRequests, Boolean logResponses, Duration timeout)
    Constructs a MistralAiStreamingChatModel with the specified parameters.
  • Method Summary

    Modifier and Type
    Method
    Description
     
    void
    generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
    Generates a streamed token response based on the given list of messages and a tool specification.
    void
    generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
    Generates a streamed token response based on the given list of messages.
    void
    generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
    Generates a streamed token response based on the given list of messages and tool specifications.
    static MistralAiStreamingChatModel
    withApiKey(String apiKey)
    Creates a MistralAiStreamingChatModel with the specified API key.

    Methods inherited from class java.lang.Object

    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

    Methods inherited from interface dev.langchain4j.model.chat.StreamingChatLanguageModel

    generate, generate
  • Constructor Details

    • MistralAiStreamingChatModel

      public MistralAiStreamingChatModel(String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, String responseFormat, Boolean logRequests, Boolean logResponses, Duration timeout)
      Constructs a MistralAiStreamingChatModel with the specified parameters.
      Parameters:
      baseUrl - the base URL of the Mistral AI API; the default URL is used if not specified
      apiKey - the API key for authentication
      modelName - the name of the Mistral AI model to use
      temperature - the temperature parameter for generating chat responses
      topP - the top-p parameter for generating chat responses
      maxTokens - the maximum number of new tokens to generate in a chat response
      safePrompt - a flag indicating whether to use a safe prompt for generating chat responses
      randomSeed - the random seed for generating chat responses (if not specified, a random number is used)
      responseFormat - the response format for generating chat responses. Currently supported values are "text" and "json_object".
      logRequests - a flag indicating whether to log raw HTTP requests
      logResponses - a flag indicating whether to log raw HTTP responses
      timeout - the timeout duration for API requests
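The constructor above can be invoked directly as sketched below. The baseUrl value and the concrete parameter choices are illustrative; in practice any of the nullable parameters may be passed as null to fall back to defaults:

```java
import java.time.Duration;

import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

public class ConstructorExample {

    public static void main(String[] args) {
        MistralAiStreamingChatModel model = new MistralAiStreamingChatModel(
                "https://api.mistral.ai/v1",            // baseUrl (illustrative)
                System.getenv("MISTRAL_AI_API_KEY"),    // apiKey
                "mistral-small",                        // modelName
                0.7,                                    // temperature
                1.0,                                    // topP
                512,                                    // maxTokens
                false,                                  // safePrompt
                null,                                   // randomSeed (random if null)
                "text",                                 // responseFormat
                true,                                   // logRequests
                true,                                   // logResponses
                Duration.ofSeconds(60));                // timeout
    }
}
```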
  • Method Details

    • withApiKey

      public static MistralAiStreamingChatModel withApiKey(String apiKey)
      Creates a MistralAiStreamingChatModel with the specified API key.
      Parameters:
      apiKey - the API key for authentication
      Returns:
      a MistralAiStreamingChatModel instance
    • generate

      public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
      Generates a streamed token response based on the given list of messages and tool specifications.
      Specified by:
      generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
      Parameters:
      messages - the list of chat messages
      toolSpecifications - the list of tool specifications. tool_choice is set to AUTO.
      handler - the response handler for processing the generated chat chunk responses
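A sketch of calling this overload with a tool specification. The get_weather tool and its parameter are hypothetical, and the ToolSpecification builder shown (name, description, addParameter with JsonSchemaProperty) is assumed from the langchain4j agent.tool package of the same release:

```java
import java.util.List;

import dev.langchain4j.agent.tool.JsonSchemaProperty;
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

public class ToolExample {

    public static void main(String[] args) {
        MistralAiStreamingChatModel model =
                MistralAiStreamingChatModel.withApiKey(System.getenv("MISTRAL_AI_API_KEY"));

        // Hypothetical tool: the model may choose to call it (tool_choice is AUTO)
        ToolSpecification weatherTool = ToolSpecification.builder()
                .name("get_weather")
                .description("Returns the current weather for a city")
                .addParameter("city", JsonSchemaProperty.STRING)
                .build();

        model.generate(
                List.of(UserMessage.from("What is the weather in Paris?")),
                List.of(weatherTool),
                new StreamingResponseHandler<AiMessage>() {

                    @Override
                    public void onNext(String token) {
                        System.out.print(token);
                    }

                    @Override
                    public void onComplete(Response<AiMessage> response) {
                        // If the model decided to call the tool, the request appears here
                        System.out.println("\nTool requests: "
                                + response.content().toolExecutionRequests());
                    }

                    @Override
                    public void onError(Throwable error) {
                        error.printStackTrace();
                    }
                });
    }
}
```

With the single-ToolSpecification overload below, tool_choice is set to ANY instead, forcing the model to call the given tool.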
    • generate

      public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
      Generates a streamed token response based on the given list of messages and a tool specification.
      Specified by:
      generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
      Parameters:
      messages - the list of chat messages
      toolSpecification - the tool specification. tool_choice is set to ANY.
      handler - the response handler for processing the generated chat chunk responses
    • generate

      public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
      Generates a streamed token response based on the given list of messages.
      Specified by:
      generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
      Parameters:
      messages - the list of chat messages
      handler - the response handler for processing the generated chat chunk responses
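Because the tokens are delivered asynchronously via the handler, a caller that needs the full answer on the current thread has to wait for onComplete. One common pattern, sketched here with a CountDownLatch (the latch and StringBuilder are plain JDK usage, not part of this API), is:

```java
import java.util.List;
import java.util.concurrent.CountDownLatch;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

public class BlockingExample {

    public static void main(String[] args) throws InterruptedException {
        MistralAiStreamingChatModel model =
                MistralAiStreamingChatModel.withApiKey(System.getenv("MISTRAL_AI_API_KEY"));

        StringBuilder answer = new StringBuilder();
        CountDownLatch latch = new CountDownLatch(1);

        model.generate(
                List.of(UserMessage.from("Summarize the Mistral AI chat API in one sentence")),
                new StreamingResponseHandler<AiMessage>() {

                    @Override
                    public void onNext(String token) {
                        answer.append(token); // accumulate tokens as they stream in
                    }

                    @Override
                    public void onComplete(Response<AiMessage> response) {
                        latch.countDown();
                    }

                    @Override
                    public void onError(Throwable error) {
                        error.printStackTrace();
                        latch.countDown();
                    }
                });

        latch.await(); // block until streaming finishes (or fails)
        System.out.println(answer);
    }
}
```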
    • builder