Uses of Class
com.hw.langchain.schema.AsyncLLMResult
Packages that use AsyncLLMResult

- com.hw.langchain.base.language
- com.hw.langchain.chat.models.base
- com.hw.langchain.llms.base
- com.hw.langchain.llms.ollama
- com.hw.langchain.llms.openai
Uses of AsyncLLMResult in com.hw.langchain.base.language
Methods in com.hw.langchain.base.language that return types with arguments of type AsyncLLMResult

- default List<reactor.core.publisher.Flux<AsyncLLMResult>> BaseLanguageModel.asyncGeneratePrompt(List<PromptValue> prompts)
  Takes in a list of prompt values and returns a Flux for every PromptValue.
- default List<reactor.core.publisher.Flux<AsyncLLMResult>> BaseLanguageModel.asyncGeneratePrompt(List<PromptValue> prompts, List<String> stop)
  Takes in a list of prompt values and returns a Flux for every PromptValue.
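Each element of the returned list is a reactive stream of partial results for one prompt. As a self-contained illustration of that shape, the sketch below uses the JDK's java.util.concurrent.Flow API in place of reactor.core.publisher.Flux, and simplifies AsyncLLMResult to a plain String chunk; the class and helper names are hypothetical, not part of the library.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Flow;
import java.util.concurrent.atomic.AtomicBoolean;

// Analogy for BaseLanguageModel.asyncGeneratePrompt: one stream of
// partial results per prompt, mirroring List<Flux<AsyncLLMResult>>.
public class AsyncGenerateSketch {

    static List<Flow.Publisher<String>> asyncGeneratePrompt(List<String> prompts) {
        List<Flow.Publisher<String>> streams = new ArrayList<>();
        for (String prompt : prompts) {
            // Synchronous publisher that emits two chunks, then completes.
            streams.add(subscriber -> subscriber.onSubscribe(new Flow.Subscription() {
                private final AtomicBoolean sent = new AtomicBoolean();

                @Override
                public void request(long n) {
                    if (sent.compareAndSet(false, true)) {
                        subscriber.onNext(prompt + ": chunk-1");
                        subscriber.onNext(prompt + ": chunk-2");
                        subscriber.onComplete();
                    }
                }

                @Override
                public void cancel() {}
            }));
        }
        return streams;
    }

    // Drain one stream into a list (works here because emission is synchronous).
    static List<String> collect(Flow.Publisher<String> stream) {
        List<String> seen = new ArrayList<>();
        stream.subscribe(new Flow.Subscriber<String>() {
            @Override public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
            @Override public void onNext(String chunk) { seen.add(chunk); }
            @Override public void onError(Throwable t) {}
            @Override public void onComplete() {}
        });
        return seen;
    }

    public static void main(String[] args) {
        for (Flow.Publisher<String> stream : asyncGeneratePrompt(List.of("hi"))) {
            System.out.println(collect(stream)); // [hi: chunk-1, hi: chunk-2]
        }
    }
}
```

With the real API, a caller would subscribe to each Flux the same way, receiving AsyncLLMResult chunks as they arrive rather than waiting for the full completion.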
Uses of AsyncLLMResult in com.hw.langchain.chat.models.base
Methods in com.hw.langchain.chat.models.base that return types with arguments of type AsyncLLMResult

- protected reactor.core.publisher.Flux<AsyncLLMResult> LLM.asyncInnerGenerate(List<String> prompts, List<String> stop)
Uses of AsyncLLMResult in com.hw.langchain.llms.base
Methods in com.hw.langchain.llms.base that return types with arguments of type AsyncLLMResult

- List<reactor.core.publisher.Flux<AsyncLLMResult>> BaseLLM.asyncGeneratePrompt(List<PromptValue> prompts, List<String> stop)
- protected abstract reactor.core.publisher.Flux<AsyncLLMResult> BaseLLM.asyncInnerGenerate(List<String> prompts, List<String> stop)
  Runs the LLM on the given prompts asynchronously.
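The stop parameter that threads through these asyncInnerGenerate signatures conventionally truncates generated text at the earliest occurrence of any stop string. A minimal stand-alone sketch of that convention (the class and method names here are hypothetical, not library API):

```java
import java.util.List;

// Illustrates the role of the `stop` parameter in
// asyncInnerGenerate(List<String> prompts, List<String> stop):
// generation is cut at the earliest matching stop sequence.
public class StopSequences {

    static String enforceStop(String generated, List<String> stop) {
        int cut = generated.length();
        for (String s : stop) {
            int idx = generated.indexOf(s);
            if (idx >= 0 && idx < cut) {
                cut = idx; // earliest stop sequence wins
            }
        }
        return generated.substring(0, cut);
    }

    public static void main(String[] args) {
        String raw = "Answer: 42\nObservation: done";
        System.out.println(enforceStop(raw, List.of("\nObservation:"))); // Answer: 42
    }
}
```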
Uses of AsyncLLMResult in com.hw.langchain.llms.ollama
Methods in com.hw.langchain.llms.ollama that return types with arguments of type AsyncLLMResult

- protected reactor.core.publisher.Flux<AsyncLLMResult> Ollama.asyncInnerGenerate(List<String> prompts, List<String> stop)
Uses of AsyncLLMResult in com.hw.langchain.llms.openai
Methods in com.hw.langchain.llms.openai that return types with arguments of type AsyncLLMResult

- protected reactor.core.publisher.Flux<AsyncLLMResult> BaseOpenAI.asyncInnerGenerate(List<String> prompts, List<String> stop)
- protected reactor.core.publisher.Flux<AsyncLLMResult> OpenAIChat.asyncInnerGenerate(List<String> prompts, List<String> stop)