Uses of Interface
org.anchoranalysis.inference.InferenceModel
Packages that use InferenceModel

Package                                     Description
org.anchoranalysis.inference.concurrency    Specifying how many CPUs and GPUs can be allocated for some purpose.
Uses of InferenceModel in org.anchoranalysis.inference.concurrency
Classes in org.anchoranalysis.inference.concurrency with type parameters of type InferenceModel

Modifier and Type   Class                                           Description
final class         ConcurrentModel<T extends InferenceModel>       An instance of a model that can be used concurrently for inference.
class               ConcurrentModelPool<T extends InferenceModel>   Keeps concurrent copies of a model to be used by different threads.
interface           CreateModelForPool<T extends InferenceModel>    Creates a model to use in the pool.
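The table above describes a pooling pattern: ConcurrentModelPool keeps several copies of a model so that different threads can run inference in parallel, each borrowing its own copy. The sketch below illustrates that pattern in isolation; it is a minimal, hypothetical example using a BlockingQueue, not the actual anchoranalysis API, and the names Model, Pool, and infer are invented for illustration.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

public class ModelPoolSketch {

    // Hypothetical stand-in for an inference model (the real interface is
    // org.anchoranalysis.inference.InferenceModel, whose methods differ).
    interface Model {
        String infer(String input);
    }

    // Sketch of the pooling idea: a fixed number of model copies are kept
    // in a queue; each caller borrows one, runs inference, and returns it.
    static class Pool<T extends Model> {
        private final BlockingQueue<T> available;

        Pool(int copies, Supplier<T> createModel) {
            available = new ArrayBlockingQueue<>(copies);
            for (int i = 0; i < copies; i++) {
                available.add(createModel.get());
            }
        }

        // Blocks until a model copy is free, so no copy is ever shared
        // between two threads at the same time.
        String infer(String input) throws InterruptedException {
            T model = available.take();
            try {
                return model.infer(input);
            } finally {
                available.put(model); // always return the copy to the pool
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        // Two copies of a trivial model, usable from two threads at once.
        Pool<Model> pool = new Pool<>(2, () -> in -> "result:" + in);
        System.out.println(pool.infer("x"));
    }
}
```

The supplier argument plays the role that the CreateModelForPool interface plays in the table above: it defers model construction so the pool can create exactly as many copies as it needs.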