Class LLM
java.lang.Object
com.hw.langchain.llms.base.BaseLLM
com.hw.langchain.chat.models.base.LLM
- All Implemented Interfaces:
BaseLanguageModel
- Direct Known Subclasses:
ChatGLM
Base LLM abstract class.
The purpose of this class is to expose a simpler interface for working with LLMs: rather than expecting the user to implement the full innerGenerate method, a concrete subclass only needs to provide innerCall.
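For illustration, here is a minimal sketch of a concrete subclass. The class name EchoLLM and its behaviour are hypothetical; the sketch assumes that innerCall and the inherited llmType are the only abstract methods a concrete subclass must implement, and it uses the fully qualified class name shown in the hierarchy above.

import java.util.List;

import com.hw.langchain.chat.models.base.LLM;

// Hypothetical subclass: echoes the prompt back instead of calling a real model.
public class EchoLLM extends LLM {

    @Override
    public String llmType() {
        // Short identifier for this LLM type (assumes llmType() returns a String).
        return "echo";
    }

    @Override
    public String innerCall(String prompt, List<String> stop) {
        // A real implementation would send the prompt to a model endpoint and,
        // if a stop list is given, truncate the completion at the first stop sequence.
        return "Echo: " + prompt;
    }
}

With innerCall in place, the generate, call, and predict methods inherited from BaseLLM become usable on the subclass without further work.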
- Author:
- HamaWhite
-
Constructor Summary
LLM()
-
Method Summary
Modifier and Type / Method / Description
-
protected reactor.core.publisher.Flux<AsyncLLMResult>
asyncInnerGenerate(List<String> prompts, List<String> stop)
Run the LLM on the given prompts async.
-
abstract String
innerCall(String prompt, List<String> stop)
Run the LLM on the given prompt and input.
-
protected LLMResult
innerGenerate(List<String> prompts, List<String> stop)
Run the LLM on the given prompts.
-
Methods inherited from class com.hw.langchain.llms.base.BaseLLM
asyncGeneratePrompt, asyncPredict, call, call, generate, generatePrompt, llmType, predict, predictMessages
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface com.hw.langchain.base.language.BaseLanguageModel
asyncGeneratePrompt, asyncPredict, asyncPredictMessages, predict, predictMessages
-
Constructor Details
-
LLM
public LLM()
-
-
Method Details
-
innerCall
public abstract String innerCall(String prompt, List<String> stop)
Run the LLM on the given prompt and input.
- Parameters:
prompt - The prompt to pass into the model.
stop - List of stop words to use when generating.
- Returns:
The string generated by the model.
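As a hedged usage sketch: this assumes the single-argument call overload listed among the inherited BaseLLM methods accepts a prompt String and routes it through innerCall, and it reuses the hypothetical EchoLLM subclass sketched in the class description above.

import com.hw.langchain.chat.models.base.LLM;

public class EchoLLMDemo {
    public static void main(String[] args) {
        // Hypothetical subclass from the sketch in the class description.
        LLM llm = new EchoLLM();

        // Assumption: call(String) wraps the prompt and delegates to innerCall(prompt, stop).
        String answer = llm.call("What is 2 + 2?");

        System.out.println(answer);  // prints "Echo: What is 2 + 2?"
    }
}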
-
innerGenerate
protected LLMResult innerGenerate(List<String> prompts, List<String> stop)
Description copied from class: BaseLLM
Run the LLM on the given prompts.
- Specified by:
innerGenerate in class BaseLLM
-
asyncInnerGenerate
protected reactor.core.publisher.Flux<AsyncLLMResult> asyncInnerGenerate(List<String> prompts, List<String> stop)
Description copied from class: BaseLLM
Run the LLM on the given prompts async.
- Specified by:
asyncInnerGenerate in class BaseLLM
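A hedged consumer-side sketch: asyncInnerGenerate itself is protected, so application code would normally go through an async entry point such as asyncPredict inherited from BaseLLM. The sketch assumes asyncPredict(String) returns a reactor Flux of streamed String chunks, which this page does not confirm, and it reuses the hypothetical EchoLLM subclass from the class description.

import reactor.core.publisher.Flux;

import com.hw.langchain.chat.models.base.LLM;

public class AsyncEchoDemo {
    public static void main(String[] args) {
        // Hypothetical subclass from the sketch in the class description.
        LLM llm = new EchoLLM();

        // Assumption: asyncPredict(String) exposes the streamed output as a Flux<String>.
        Flux<String> stream = llm.asyncPredict("Tell me a short story.");

        // Print each chunk as it arrives; blockLast() waits for the stream to complete.
        stream.doOnNext(System.out::print).blockLast();
    }
}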
-