Class LLM
java.lang.Object
com.hw.langchain.llms.base.BaseLLM
com.hw.langchain.chat.models.base.LLM
- All Implemented Interfaces:
BaseLanguageModel
- Direct Known Subclasses:
ChatGLM
Base LLM abstract class.
The purpose of this class is to expose a simpler interface for working with LLMs, rather than expecting the user to implement the full innerGenerate method.
- Author:
HamaWhite
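For illustration, a minimal sketch of a concrete subclass is shown below. The class name EchoLLM is hypothetical, the import path simply follows the class hierarchy listed above, and the exact modifiers of innerCall and llmType are assumptions to be checked against the real sources:

import java.util.List;

import com.hw.langchain.chat.models.base.LLM;

// Hypothetical subclass used only for illustration: it "generates" text by
// echoing the prompt back, truncated at the first stop word that occurs in it.
public class EchoLLM extends LLM {

    @Override
    public String llmType() {
        // Identifier returned by the llmType() hook inherited from BaseLLM
        // (assumed here to be overridable and to return a String).
        return "echo";
    }

    @Override
    public String innerCall(String prompt, List<String> stop) {
        // Only this single-prompt method needs to be implemented; the inherited
        // innerGenerate/asyncInnerGenerate machinery is built on top of it.
        String output = prompt;
        if (stop != null) {
            for (String stopWord : stop) {
                int index = output.indexOf(stopWord);
                if (index >= 0) {
                    output = output.substring(0, index);
                }
            }
        }
        return output;
    }
}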
Constructor Summary
Constructors
LLM()
Method Summary
- protected reactor.core.publisher.Flux<AsyncLLMResult> asyncInnerGenerate(List<String> prompts, List<String> stop)
  Run the LLM on the given prompts async.
- abstract String innerCall(String prompt, List<String> stop)
  Run the LLM on the given prompt and input.
- protected LLMResult innerGenerate(List<String> prompts, List<String> stop)
  Run the LLM on the given prompts.

Methods inherited from class com.hw.langchain.llms.base.BaseLLM
asyncGeneratePrompt, asyncPredict, call, call, generate, generatePrompt, llmType, predict, predictMessages

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface com.hw.langchain.base.language.BaseLanguageModel
asyncGeneratePrompt, asyncPredict, asyncPredictMessages, predict, predictMessages
Constructor Details

LLM
public LLM()
Method Details
innerCall
Run the LLM on the given prompt and input.
- Parameters:
prompt - The prompt to pass into the model.
stop - list of stop words to use when generating.
- Returns:
The string generated by the model.
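As a hedged usage sketch building on the EchoLLM example above, callers would normally not invoke innerCall directly but go through one of the call overloads inherited from BaseLLM; the single-argument call(String) signature assumed below is not shown on this page and may differ:

// Hypothetical usage; call(String) is assumed to delegate down to innerCall
// with no stop words, so the echo model should return the prompt unchanged.
LLM llm = new EchoLLM();
String text = llm.call("Hello, echo!");
System.out.println(text);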
innerGenerate
protected LLMResult innerGenerate(List<String> prompts, List<String> stop)
Description copied from class: BaseLLM
Run the LLM on the given prompts.
- Specified by:
innerGenerate in class BaseLLM
asyncInnerGenerate
protected reactor.core.publisher.Flux<AsyncLLMResult> asyncInnerGenerate(List<String> prompts, List<String> stop)
Description copied from class: BaseLLM
Run the LLM on the given prompts async.
- Specified by:
asyncInnerGenerate in class BaseLLM