Uses of Class
com.hw.langchain.schema.AsyncLLMResult
Uses of AsyncLLMResult in com.hw.langchain.base.language

- default List<reactor.core.publisher.Flux<AsyncLLMResult>> BaseLanguageModel.asyncGeneratePrompt(List<PromptValue> prompts)
  Take in a list of prompt values and return a Flux for every PromptValue.
- default List<reactor.core.publisher.Flux<AsyncLLMResult>> BaseLanguageModel.asyncGeneratePrompt(List<PromptValue> prompts, List<String> stop)
  Take in a list of prompt values and return a Flux for every PromptValue.
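The contract above — one reactive stream per input prompt — can be sketched with plain CompletableFutures standing in for Reactor's Flux (the real API returns List<Flux<AsyncLLMResult>>; the AsyncGenerateSketch class and its echo logic here are hypothetical illustrations, not part of the library):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

public class AsyncGenerateSketch {
    // Hypothetical stand-in: maps each prompt to its own async result,
    // mirroring asyncGeneratePrompt's "one stream per PromptValue" shape.
    static List<CompletableFuture<String>> asyncGeneratePrompt(List<String> prompts) {
        return prompts.stream()
                .map(p -> CompletableFuture.supplyAsync(() -> "echo: " + p))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<CompletableFuture<String>> results = asyncGeneratePrompt(List.of("hi", "bye"));
        // Each prompt gets its own independently consumable async result.
        results.forEach(f -> System.out.println(f.join()));
    }
}
```

The point of returning a list of streams rather than one combined stream is that each prompt's results can be consumed, buffered, or cancelled independently.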
Uses of AsyncLLMResult in com.hw.langchain.chat.models.base

- protected reactor.core.publisher.Flux<AsyncLLMResult> LLM.asyncInnerGenerate(List<String> prompts, List<String> stop)
Uses of AsyncLLMResult in com.hw.langchain.llms.base

- List<reactor.core.publisher.Flux<AsyncLLMResult>> BaseLLM.asyncGeneratePrompt(List<PromptValue> prompts, List<String> stop)
- protected abstract reactor.core.publisher.Flux<AsyncLLMResult> BaseLLM.asyncInnerGenerate(List<String> prompts, List<String> stop)
  Run the LLM on the given prompts asynchronously.
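The split above follows the template-method pattern: the public asyncGeneratePrompt fans prompts out and delegates each one to the protected abstract asyncInnerGenerate, which subclasses like Ollama and OpenAIChat implement. A minimal sketch of that split, using CompletableFuture in place of Reactor's Flux (BaseLLMSketch, FakeLLM, and the stop-sequence trimming are hypothetical):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

abstract class BaseLLMSketch {
    // Subclasses supply the provider-specific async call, as with
    // BaseLLM.asyncInnerGenerate in the real API.
    protected abstract CompletableFuture<String> asyncInnerGenerate(String prompt, List<String> stop);

    // Public entry point fans out: one async result per prompt.
    public List<CompletableFuture<String>> asyncGenerate(List<String> prompts, List<String> stop) {
        return prompts.stream()
                .map(p -> asyncInnerGenerate(p, stop))
                .collect(Collectors.toList());
    }
}

class FakeLLM extends BaseLLMSketch {
    @Override
    protected CompletableFuture<String> asyncInnerGenerate(String prompt, List<String> stop) {
        // Hypothetical behavior: echo the prompt and truncate the text
        // at the first stop sequence found.
        return CompletableFuture.supplyAsync(() -> {
            String text = prompt + " -> done";
            for (String s : stop) {
                int i = text.indexOf(s);
                if (i >= 0) text = text.substring(0, i);
            }
            return text;
        });
    }
}
```

With this shape, adding a new provider means overriding only asyncInnerGenerate; the fan-out over prompts is inherited from the base class.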
Uses of AsyncLLMResult in com.hw.langchain.llms.ollama

- protected reactor.core.publisher.Flux<AsyncLLMResult> Ollama.asyncInnerGenerate(List<String> prompts, List<String> stop)
Uses of AsyncLLMResult in com.hw.langchain.llms.openai

- protected reactor.core.publisher.Flux<AsyncLLMResult> BaseOpenAI.asyncInnerGenerate(List<String> prompts, List<String> stop)
- protected reactor.core.publisher.Flux<AsyncLLMResult> OpenAIChat.asyncInnerGenerate(List<String> prompts, List<String> stop)