Package com.hw.langchain.llms.ollama

Class Ollama

java.lang.Object
    com.hw.langchain.llms.base.BaseLLM
        com.hw.langchain.llms.ollama.Ollama

All Implemented Interfaces:
    BaseLanguageModel

Ollama locally runs large language models.

Author:
    HamaWhite
Constructor Summary

Ollama()

Method Summary

protected reactor.core.publisher.Flux<AsyncLLMResult> asyncInnerGenerate(List<String> prompts, List<String> stop)
    Run the LLM on the given prompts asynchronously.

createStream(String prompt, List<String> stop)

init()

protected LLMResult innerGenerate(List<String> prompts, List<String> stop)
    Call out to the Ollama generate endpoint.

llmType()
    Return the type of llm.

static GenerationChunk streamResponseToGenerationChunk(String streamResponse)
    Convert a stream response to a generation chunk.

Methods inherited from class com.hw.langchain.llms.base.BaseLLM
    asyncGeneratePrompt, asyncPredict, call, call, generate, generatePrompt, predict, predictMessages

Methods inherited from class java.lang.Object
    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface com.hw.langchain.base.language.BaseLanguageModel
    asyncGeneratePrompt, asyncPredict, asyncPredictMessages, predict, predictMessages
Constructor Details

Ollama

public Ollama()

Method Details
llmType

Description copied from class: BaseLLM
Return the type of llm.

init

createStream
innerGenerate

Call out to the Ollama generate endpoint.

Specified by:
    innerGenerate in class BaseLLM
Parameters:
    prompts - The prompts to pass into the model.
    stop - List of stop words to use when generating.
Returns:
    The LLMResult generated by the model.
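As a rough illustration of what "call out to the Ollama generate endpoint" involves (a sketch, not this library's actual implementation), the snippet below builds such an HTTP request with only the JDK's java.net.http. The default base URL http://localhost:11434, the /api/generate path, and the model/prompt/stream body fields come from Ollama's public HTTP API; buildGenerateRequest is a hypothetical helper name.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class OllamaRequestSketch {

    // Hypothetical helper: builds the POST request a generate call might send.
    // Endpoint path and JSON field names follow Ollama's HTTP API, not this
    // library's internals. A real client would JSON-escape the prompt.
    static HttpRequest buildGenerateRequest(String baseUrl, String model, String prompt) {
        String body = "{\"model\":\"" + model + "\",\"prompt\":\"" + prompt + "\",\"stream\":false}";
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest request = buildGenerateRequest("http://localhost:11434", "llama2", "Say hello.");
        System.out.println(request.method() + " " + request.uri());
    }
}
```

Sending the request (via java.net.http.HttpClient) is omitted here, since it requires a running Ollama server on the default port.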
asyncInnerGenerate

protected reactor.core.publisher.Flux<AsyncLLMResult> asyncInnerGenerate(List<String> prompts, List<String> stop)

Description copied from class: BaseLLM
Run the LLM on the given prompts asynchronously.

Specified by:
    asyncInnerGenerate in class BaseLLM
streamResponseToGenerationChunk

Convert a stream response to a generation chunk.

Parameters:
    streamResponse - The stream response as a JSON string.
Returns:
    A GenerationChunk object containing the converted data.
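To make the conversion concrete: Ollama's streaming API emits one JSON object per line, each carrying a "response" fragment and a "done" flag. The following stdlib-only sketch shows how one such line could be reduced to a chunk. The Chunk record is a stand-in for GenerationChunk, toChunk is a hypothetical helper, and the naive string scan (no real JSON parsing, no escaped quotes) is an assumption for brevity; the actual streamResponseToGenerationChunk presumably uses a proper JSON library.

```java
public class StreamChunkSketch {

    // Minimal stand-in for GenerationChunk: the text of one streamed piece
    // plus whether the stream has finished.
    record Chunk(String text, boolean done) {}

    // Hypothetical sketch of the conversion: pull the "response" text out of
    // one JSON line of an Ollama stream. Only handles flat JSON without
    // escaped quotes; a real implementation would use a JSON parser.
    static Chunk toChunk(String streamResponse) {
        String text = extractString(streamResponse, "response");
        boolean done = streamResponse.contains("\"done\":true");
        return new Chunk(text, done);
    }

    private static String extractString(String json, String field) {
        String key = "\"" + field + "\":\"";
        int start = json.indexOf(key);
        if (start < 0) {
            return "";
        }
        start += key.length();
        int end = json.indexOf('"', start);
        return json.substring(start, end);
    }

    public static void main(String[] args) {
        String line = "{\"model\":\"llama2\",\"response\":\"Hel\",\"done\":false}";
        Chunk chunk = toChunk(line);
        System.out.println(chunk.text() + " done=" + chunk.done()); // prints "Hel done=false"
    }
}
```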