All Classes and Interfaces

Class
Description
Chat message entity
Chat role definition
Completion result entity
Batch decode exception
Token generation status
Generation parameters
Mirostat sampling mode definition
Generation exception
Model inference generator; supports streaming text output and complete text generation.
Llama context params entity
Llama model params entity
Llama RoPE scaling type definition
Llama.cpp API
Llama token type definition
Customizable processor that adjusts the token probability distribution to control the output of model inference.
Logits processor list
Generation metrics
Llama model, which provides text generation and chat conversation functions.
Model exception
Llama model parameters
Model type
Prompt builder
Customizable controller that implements stopping rules for model inference.
Stopping criteria list
Token
Token decoder
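The index above lists two customization points: a logits processor that adjusts the token probability distribution, and a stopping criterion that halts inference. The sketch below illustrates what such hooks typically look like; all interface and class names here are hypothetical assumptions for illustration, not the library's actual API.

```java
import java.util.Arrays;

public class CustomizationSketch {

    // Hypothetical hook: adjust raw logits before sampling.
    interface LogitsProcessor {
        float[] process(int[] generatedTokens, float[] logits);
    }

    // Hypothetical hook: decide whether generation should stop.
    interface StoppingCriteria {
        boolean shouldStop(int[] generatedTokens);
    }

    // Example processor: temperature scaling of the logits.
    // A higher temperature flattens the distribution before sampling.
    static class TemperatureProcessor implements LogitsProcessor {
        private final float temperature;

        TemperatureProcessor(float temperature) {
            this.temperature = temperature;
        }

        @Override
        public float[] process(int[] generatedTokens, float[] logits) {
            float[] out = new float[logits.length];
            for (int i = 0; i < logits.length; i++) {
                out[i] = logits[i] / temperature;
            }
            return out;
        }
    }

    // Example criterion: stop after a maximum number of generated tokens.
    static class MaxTokensCriteria implements StoppingCriteria {
        private final int maxTokens;

        MaxTokensCriteria(int maxTokens) {
            this.maxTokens = maxTokens;
        }

        @Override
        public boolean shouldStop(int[] generatedTokens) {
            return generatedTokens.length >= maxTokens;
        }
    }

    public static void main(String[] args) {
        LogitsProcessor processor = new TemperatureProcessor(2.0f);
        float[] scaled = processor.process(new int[0], new float[]{4.0f, 2.0f});
        System.out.println(Arrays.toString(scaled)); // [2.0, 1.0]

        StoppingCriteria criteria = new MaxTokensCriteria(3);
        System.out.println(criteria.shouldStop(new int[]{1, 2}));    // false
        System.out.println(criteria.shouldStop(new int[]{1, 2, 3})); // true
    }
}
```

In a real generation loop the generator would apply each registered processor to the logits before sampling a token, then consult each stopping criterion after appending it, which is why both hooks receive the tokens generated so far.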