Interface MistralAiRestApi


@Path("")
@ClientHeaderParam(name="Authorization", value="Bearer {token}")
@Consumes("application/json")
@Produces("application/json")
@RegisterProvider(MistralAiRestApiJacksonReader.class)
@RegisterProvider(MistralAiRestApiJacksonWriter.class)
@RegisterProvider(MistralAiRestApiWriterInterceptor.class)
public interface MistralAiRestApi
This MicroProfile REST client is the building block for all API calls to MistralAI. The implementation is provided by the Reactive REST Client in Quarkus.
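A note on the Authorization header: the `{token}` placeholder in the `@ClientHeaderParam` template above is filled in per call, presumably from the `@NotBody String token` parameter that every method accepts, so each request carries `Authorization: Bearer <token>`. The following is an illustrative stdlib-only sketch of that substitution, not the actual REST client machinery:

```java
// Sketch (assumption): mimics how the client expands the
// @ClientHeaderParam template "Bearer {token}" for a single call.
public class AuthHeaderSketch {

    // Replaces the {token} placeholder with the per-call token value.
    static String authorizationHeader(String template, String token) {
        return template.replace("{token}", token);
    }

    public static void main(String[] args) {
        String header = authorizationHeader("Bearer {token}", "my-api-key");
        System.out.println(header); // Bearer my-api-key
    }
}
```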
  • Nested Class Summary

    Nested Classes (all static classes; names omitted in this listing)
    static class
        Ensures that the terminal event sent by MistralAI is not processed (as it is not a valid JSON event)
    static class
    static class
    static class
        Ensures that the stream value of the request is set automatically, so users don't have to set it manually
    static class
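The stream-flag behavior described above can be sketched in plain Java. All names here are illustrative assumptions, not the real interceptor or request classes:

```java
import java.util.Objects;

// Sketch (assumption): before a streaming request body is serialized, a
// writer interceptor forces stream=true so callers never set it themselves.
public class StreamFlagSketch {

    static class ChatCompletionRequest {
        Boolean stream; // typically left null by the caller
    }

    // Mimics what the interceptor does before the body is written.
    static void ensureStreamFlag(ChatCompletionRequest request, boolean streaming) {
        Objects.requireNonNull(request, "request");
        request.stream = streaming;
    }

    public static void main(String[] args) {
        ChatCompletionRequest request = new ChatCompletionRequest();
        ensureStreamFlag(request, true);
        System.out.println(request.stream); // true
    }
}
```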
  • Method Summary

    dev.langchain4j.model.mistralai.internal.api.MistralAiChatCompletionResponse
    blockingChatCompletion(dev.langchain4j.model.mistralai.internal.api.MistralAiChatCompletionRequest request, String token)
        Performs a blocking request for a chat completion response
    dev.langchain4j.model.mistralai.internal.api.MistralAiEmbeddingResponse
    embedding(dev.langchain4j.model.mistralai.internal.api.MistralAiEmbeddingRequest request, String token)
    dev.langchain4j.model.mistralai.internal.api.MistralAiModelResponse
    models(String token)
    io.smallrye.mutiny.Multi<dev.langchain4j.model.mistralai.internal.api.MistralAiChatCompletionResponse>
    streamingChatCompletion(dev.langchain4j.model.mistralai.internal.api.MistralAiChatCompletionRequest request, String token)
        Performs a non-blocking request for a streaming chat completion response
  • Method Details

    • blockingChatCompletion

      @Path("chat/completions")
      @POST
      dev.langchain4j.model.mistralai.internal.api.MistralAiChatCompletionResponse blockingChatCompletion(dev.langchain4j.model.mistralai.internal.api.MistralAiChatCompletionRequest request, @NotBody String token)
      Performs a blocking request for a chat completion response
    • streamingChatCompletion

      @Path("chat/completions")
      @POST
      @RestStreamElementType("application/json")
      io.smallrye.mutiny.Multi<dev.langchain4j.model.mistralai.internal.api.MistralAiChatCompletionResponse> streamingChatCompletion(dev.langchain4j.model.mistralai.internal.api.MistralAiChatCompletionRequest request, @NotBody String token)
      Performs a non-blocking request for a streaming chat completion response
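      Each item emitted by the returned Multi carries a partial chunk of the answer; a subscriber typically appends the chunks in order until the stream completes (with Mutiny this would be Multi#subscribe().with(onItem, onFailure, onComplete)). A stdlib-only sketch of that consumption pattern, with the Multi replaced by a plain list of chunks:

```java
import java.util.List;
import java.util.function.Consumer;

// Sketch (assumption): simulates subscribing to the streaming response.
// The real return type is a Mutiny Multi<MistralAiChatCompletionResponse>;
// here a List stands in for the stream of emitted items.
public class StreamingConsumerSketch {

    static void consume(List<String> chunks, Consumer<String> onItem, Runnable onComplete) {
        chunks.forEach(onItem); // one onItem call per emitted chunk, in order
        onComplete.run();       // signalled once the stream terminates
    }

    public static void main(String[] args) {
        StringBuilder answer = new StringBuilder();
        consume(List.of("Hel", "lo", "!"),
                answer::append,
                () -> System.out.println(answer)); // Hello!
    }
}
```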
    • embedding

      @Path("embeddings")
      @POST
      dev.langchain4j.model.mistralai.internal.api.MistralAiEmbeddingResponse embedding(dev.langchain4j.model.mistralai.internal.api.MistralAiEmbeddingRequest request, @NotBody String token)
    • models

      @Path("models")
      @GET
      dev.langchain4j.model.mistralai.internal.api.MistralAiModelResponse models(@NotBody String token)