Interface EmbeddingModelConfig
public interface EmbeddingModelConfig
Method Details
modelId
Model to use. According to the Ollama docs, the default value is nomic-embed-text.
temperature
The temperature of the model. Increasing the temperature will make the model answer with more variability. A lower temperature will make the model answer more conservatively.
numPredict
Maximum number of tokens to predict when generating text.
stop
Sets the stop sequences to use. When one of these sequences is encountered, the LLM stops generating text and returns.
topP
Works together with top-k. A higher value (e.g., 0.95) will lead to more diverse text, while a lower value (e.g., 0.5) will generate more focused and conservative text.
topK
Reduces the probability of generating nonsense. A higher value (e.g., 100) will give more diverse answers, while a lower value (e.g., 10) will be more conservative.
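A minimal sketch of how the parameters above could be grouped into one configuration value. The record below is hypothetical (the real interface may expose these as accessor methods on a builder-produced object); the field names simply mirror the documented methods, and the sample values are illustrative, not defaults.

```java
import java.util.List;

// Hypothetical immutable holder mirroring the documented accessors:
// modelId, temperature, numPredict, stop, topP, topK.
record EmbeddingModelConfigSketch(
        String modelId,      // model name; Ollama's documented default is "nomic-embed-text"
        Double temperature,  // higher -> more variable answers, lower -> more conservative
        Integer numPredict,  // maximum number of tokens to predict
        List<String> stop,   // stop sequences that end generation
        Double topP,         // nucleus sampling threshold, works together with top-k
        Integer topK) {      // top-k sampling cutoff
}

class ConfigDemo {
    public static void main(String[] args) {
        // Illustrative values only, not the library's defaults.
        EmbeddingModelConfigSketch config = new EmbeddingModelConfigSketch(
                "nomic-embed-text", 0.2, 256, List.of("\n\n"), 0.9, 40);
        System.out.println(config.modelId() + " topK=" + config.topK());
    }
}
```

Keeping the settings in a single immutable value like this makes it easy to pass one configuration object to wherever the model is invoked, rather than threading six separate parameters through the call chain.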