Package com.hw.langchain.llms.ollama
Class Ollama

java.lang.Object
    com.hw.langchain.llms.base.BaseLLM
        com.hw.langchain.llms.ollama.Ollama

All Implemented Interfaces:
    BaseLanguageModel

Ollama locally runs large language models.

Author:
    HamaWhite

Constructor Summary

Constructors
    Ollama()
Method Summary

protected reactor.core.publisher.Flux<AsyncLLMResult> asyncInnerGenerate(List<String> prompts, List<String> stop)
    Run the LLM on the given prompts asynchronously.
createStream(String prompt, List<String> stop)
init()
protected LLMResult innerGenerate(List<String> prompts, List<String> stop)
    Call out to Ollama's generate endpoint.
llmType()
    Return type of llm.
static GenerationChunk streamResponseToGenerationChunk(String streamResponse)
    Convert a stream response to a generation chunk.

Methods inherited from class com.hw.langchain.llms.base.BaseLLM:
    asyncGeneratePrompt, asyncPredict, call, call, generate, generatePrompt, predict, predictMessages

Methods inherited from class java.lang.Object:
    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface com.hw.langchain.base.language.BaseLanguageModel:
    asyncGeneratePrompt, asyncPredict, asyncPredictMessages, predict, predictMessages
Constructor Details

Ollama
    public Ollama()
Method Details

llmType
    Description copied from class: BaseLLM
    Return type of llm.

init

createStream
innerGenerate
Call out to Ollama to generate endpoint.- Specified by:
innerGeneratein classBaseLLM- Parameters:
prompts- The prompt to pass into the model.stop- list of stop words to use when generating.- Returns:
- The string generated by the model.
-
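To make the generate call concrete, here is a self-contained sketch of the kind of JSON request body such a call might POST to Ollama's /api/generate endpoint. The field names ("model", "prompt", "stream", and "stop" nested under "options") follow the public Ollama REST API, not this class's internals; the buildBody helper is hypothetical and shown only to illustrate how prompts and stop words map onto the request.

```java
import java.util.List;

public class OllamaRequestSketch {

    // Hypothetical helper: builds the JSON body a generate call might send.
    // Field names are assumed from the public Ollama REST API; the real
    // innerGenerate implementation may serialize differently.
    static String buildBody(String model, String prompt, List<String> stop) {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"model\":\"").append(model).append("\",");
        sb.append("\"prompt\":\"").append(prompt).append("\",");
        sb.append("\"stream\":false");
        if (stop != null && !stop.isEmpty()) {
            // Stop sequences live under "options" in the Ollama API.
            sb.append(",\"options\":{\"stop\":[");
            for (int i = 0; i < stop.size(); i++) {
                if (i > 0) sb.append(',');
                sb.append('"').append(stop.get(i)).append('"');
            }
            sb.append("]}");
        }
        sb.append('}');
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildBody("llama2", "Say hello", List.of("END")));
        // {"model":"llama2","prompt":"Say hello","stream":false,"options":{"stop":["END"]}}
    }
}
```

Note the sketch does no JSON escaping; a real implementation would use a JSON mapper rather than string concatenation.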
asyncInnerGenerate
protected reactor.core.publisher.Flux<AsyncLLMResult> asyncInnerGenerate(List<String> prompts, List<String> stop) Description copied from class:BaseLLMRun the LLM on the given prompts async.- Specified by:
asyncInnerGeneratein classBaseLLM
-
streamResponseToGenerationChunk
Convert a stream response to a generation chunk.- Parameters:
streamResponse- The stream response as a JSON string.- Returns:
- A GenerationChunk object containing the converted data.
-
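To illustrate what this conversion does, here is a hedged, self-contained sketch that pulls the "response" text out of one line of Ollama's streaming output (each line is a small JSON object such as {"model":"llama2","response":"Hel","done":false}). The nested GenerationChunk record is a stand-in for the library's own class, and the hand-rolled string scan stands in for whatever JSON mapper the real method uses.

```java
public class StreamChunkSketch {

    // Stand-in for the library's GenerationChunk: just the generated text.
    record GenerationChunk(String text) {}

    // Illustrative only: extracts the "response" field from one streamed
    // JSON line. Does not handle escaped quotes inside the value; the real
    // method presumably deserializes the line with a JSON mapper.
    static GenerationChunk toChunk(String streamResponse) {
        String key = "\"response\":\"";
        int start = streamResponse.indexOf(key);
        if (start < 0) {
            // Final line of a stream ({"done":true,...}) carries no text.
            return new GenerationChunk("");
        }
        start += key.length();
        int end = streamResponse.indexOf('"', start);
        return new GenerationChunk(streamResponse.substring(start, end));
    }

    public static void main(String[] args) {
        System.out.println(
            toChunk("{\"model\":\"llama2\",\"response\":\"Hel\",\"done\":false}").text());
        // Hel
    }
}
```

A caller would typically concatenate the text of successive chunks until a line with "done":true arrives.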