Package com.hw.langchain.base.language
Interface BaseLanguageModel
- All Known Implementing Classes:
  BaseChatModel, BaseLLM, BaseOpenAI, ChatGLM, ChatOpenAI, LLM, Ollama, OpenAI, OpenAIChat
public interface BaseLanguageModel
BaseLanguageModel is an interface for interacting with a language model.
- Author:
  HamaWhite
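A minimal usage sketch, assuming ChatOpenAI (one of the known implementing classes) is configured through a builder and an init() call; those configuration details are assumptions about the implementation, while predict(String) is the only method below that is defined by this interface.

    import com.hw.langchain.base.language.BaseLanguageModel;
    import com.hw.langchain.chat.models.openai.ChatOpenAI;   // package location assumed

    public class PredictExample {
        public static void main(String[] args) {
            // Any implementing class works here; builder()/temperature()/init()
            // are assumptions about ChatOpenAI's configuration API.
            BaseLanguageModel model = ChatOpenAI.builder().temperature(0).build().init();

            // Defined by BaseLanguageModel: plain text in, plain text out.
            String answer = model.predict("What is the capital of France?");
            System.out.println(answer);
        }
    }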
Method Summary
- default List<reactor.core.publisher.Flux<AsyncLLMResult>> asyncGeneratePrompt(List<PromptValue> prompts)
  Take in a list of prompt values and return a Flux for every PromptValue.
- default List<reactor.core.publisher.Flux<AsyncLLMResult>> asyncGeneratePrompt(List<PromptValue> prompts, List<String> stop)
  Take in a list of prompt values and return a Flux for every PromptValue.
- default reactor.core.publisher.Flux<String> asyncPredict(String text)
  Predict text from text async.
- default reactor.core.publisher.Flux<String> asyncPredict(String text, List<String> stop)
  Predict text from text async.
- default reactor.core.publisher.Flux<BaseMessage> asyncPredictMessages(List<BaseMessage> messages, List<String> stop)
  Predict message from messages async.
- LLMResult generatePrompt(List<PromptValue> prompts, List<String> stop)
  Take in a list of prompt values and return an LLMResult.
- default String predict(String text)
  Predict text from text.
- default String predict(String text, List<String> stop)
  Predict text from text.
- default BaseMessage predictMessages(List<BaseMessage> messages)
  Predict message from messages.
- default BaseMessage predictMessages(List<BaseMessage> messages, List<String> stop)
  Predict message from messages.
Method Details
- generatePrompt
  LLMResult generatePrompt(List<PromptValue> prompts, List<String> stop)
  Take in a list of prompt values and return an LLMResult.
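A hedged sketch of calling generatePrompt; PromptTemplate.fromTemplate(...) and formatPrompt(...) are assumed helpers for producing a PromptValue, the schema package locations are assumptions, and passing null for stop is assumed to mean no stop sequences.

    import java.util.List;
    import java.util.Map;

    // Package locations below are assumptions about the library layout.
    import com.hw.langchain.base.language.BaseLanguageModel;
    import com.hw.langchain.prompts.prompt.PromptTemplate;
    import com.hw.langchain.schema.LLMResult;
    import com.hw.langchain.schema.PromptValue;

    public class GeneratePromptExample {
        public static void run(BaseLanguageModel model) {
            // Assumed helpers: build a PromptValue from a template plus variables.
            PromptValue prompt = PromptTemplate.fromTemplate("Tell me a joke about {topic}.")
                    .formatPrompt(Map.<String, Object>of("topic", "ducks"));

            // The interface method: one LLMResult covering all prompts.
            LLMResult result = model.generatePrompt(List.of(prompt), null);
            System.out.println(result);
        }
    }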
- predict
  default String predict(String text)
  Predict text from text.
- predict
  default String predict(String text, List<String> stop)
  Predict text from text.
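A minimal sketch of the two predict overloads; model stands for any implementation, and the stop-sequence behaviour (generation halting at the first matching string) follows the usual LLM convention rather than anything guaranteed by this interface.

    import java.util.List;
    import com.hw.langchain.base.language.BaseLanguageModel;

    public class PredictOverloads {
        public static void run(BaseLanguageModel model) {
            // Plain text-in, text-out call.
            String answer = model.predict("Summarize the plot of Hamlet in one sentence.");

            // Overload with stop sequences: output is expected to end at the first "\n".
            String firstLine = model.predict("List three colors:", List.of("\n"));

            System.out.println(answer);
            System.out.println(firstLine);
        }
    }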
- predictMessages
  default BaseMessage predictMessages(List<BaseMessage> messages)
  Predict message from messages.
- predictMessages
  default BaseMessage predictMessages(List<BaseMessage> messages, List<String> stop)
  Predict message from messages.
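A sketch of predictMessages, assuming HumanMessage is a concrete BaseMessage subclass in the library's schema package and that it has a content-taking constructor; both are assumptions, only the predictMessages signatures come from this interface.

    import java.util.List;
    import com.hw.langchain.base.language.BaseLanguageModel;
    // Package locations and the HumanMessage constructor are assumptions.
    import com.hw.langchain.schema.BaseMessage;
    import com.hw.langchain.schema.HumanMessage;

    public class PredictMessagesExample {
        public static void run(BaseLanguageModel model) {
            List<BaseMessage> messages = List.of(new HumanMessage("Translate 'good morning' to French."));

            // Without stop sequences.
            BaseMessage reply = model.predictMessages(messages);

            // With stop sequences.
            BaseMessage shortReply = model.predictMessages(messages, List.of("\n"));

            System.out.println(reply);
            System.out.println(shortReply);
        }
    }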
- asyncGeneratePrompt
  default List<reactor.core.publisher.Flux<AsyncLLMResult>> asyncGeneratePrompt(List<PromptValue> prompts)
  Take in a list of prompt values and return a Flux for every PromptValue.
- asyncGeneratePrompt
  default List<reactor.core.publisher.Flux<AsyncLLMResult>> asyncGeneratePrompt(List<PromptValue> prompts, List<String> stop)
  Take in a list of prompt values and return a Flux for every PromptValue.
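A sketch of the streaming variant; it reuses the assumed PromptValue construction from the generatePrompt example and consumes each returned Reactor Flux of AsyncLLMResult. The blocking call is only for demonstration.

    import java.util.List;
    import java.util.Map;
    import reactor.core.publisher.Flux;

    // Package locations below are assumptions about the library layout.
    import com.hw.langchain.base.language.BaseLanguageModel;
    import com.hw.langchain.prompts.prompt.PromptTemplate;
    import com.hw.langchain.schema.AsyncLLMResult;
    import com.hw.langchain.schema.PromptValue;

    public class AsyncGeneratePromptExample {
        public static void run(BaseLanguageModel model) {
            PromptValue prompt = PromptTemplate.fromTemplate("Write a haiku about {topic}.")
                    .formatPrompt(Map.<String, Object>of("topic", "winter"));

            // Per the method description, one Flux is returned for every PromptValue.
            List<Flux<AsyncLLMResult>> streams = model.asyncGeneratePrompt(List.of(prompt));

            // Blocking here only for the demo; real code would subscribe asynchronously.
            streams.get(0).doOnNext(System.out::println).blockLast();
        }
    }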
- asyncPredict
  default reactor.core.publisher.Flux<String> asyncPredict(String text)
  Predict text from text async.
- asyncPredict
  default reactor.core.publisher.Flux<String> asyncPredict(String text, List<String> stop)
  Predict text from text async.
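A sketch of asyncPredict, which streams the generated text as a Reactor Flux<String>; chunk-by-chunk granularity is an assumption about implementations, not something the interface guarantees.

    import java.util.List;
    import reactor.core.publisher.Flux;
    import com.hw.langchain.base.language.BaseLanguageModel;

    public class AsyncPredictExample {
        public static void run(BaseLanguageModel model) {
            // Stream pieces of the answer as they are produced.
            Flux<String> stream = model.asyncPredict("Tell me a short story about a robot.");
            stream.doOnNext(System.out::print).blockLast();   // blocking only for this demo

            // Overload with stop sequences.
            Flux<String> bounded = model.asyncPredict("List three colors:", List.of("\n"));
            bounded.doOnNext(System.out::print).blockLast();
        }
    }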
- asyncPredictMessages
  default reactor.core.publisher.Flux<BaseMessage> asyncPredictMessages(List<BaseMessage> messages, List<String> stop)
  Predict message from messages async.
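A sketch of asyncPredictMessages, again assuming HumanMessage as a concrete BaseMessage and that a null stop list means no stop sequences; the blocking subscribe is purely for illustration.

    import java.util.List;
    import reactor.core.publisher.Flux;

    // Package locations and the HumanMessage constructor are assumptions.
    import com.hw.langchain.base.language.BaseLanguageModel;
    import com.hw.langchain.schema.BaseMessage;
    import com.hw.langchain.schema.HumanMessage;

    public class AsyncPredictMessagesExample {
        public static void run(BaseLanguageModel model) {
            List<BaseMessage> messages = List.of(new HumanMessage("Stream a two-sentence answer about tides."));

            // Streams message chunks; null stop list assumed to mean no stop sequences.
            Flux<BaseMessage> stream = model.asyncPredictMessages(messages, null);
            stream.doOnNext(System.out::println).blockLast();   // blocking only for this demo
        }
    }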