Class Ollama

java.lang.Object
com.hw.langchain.llms.base.BaseLLM
com.hw.langchain.llms.ollama.Ollama
All Implemented Interfaces:
BaseLanguageModel

public class Ollama extends BaseLLM
Ollama runs large language models locally.
Author:
HamaWhite
  • Constructor Details

    • Ollama

      public Ollama()
  • Method Details

    • llmType

      public String llmType()
      Description copied from class: BaseLLM
Return the type of LLM.
      Specified by:
      llmType in class BaseLLM
    • init

      public Ollama init()
    • createStream

      public List<String> createStream(String prompt, List<String> stop)
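      createStream presumably returns the raw lines of Ollama's streaming generate response, one JSON object per line (this shape matches Ollama's documented /api/generate streaming output; the exact contents of the returned list are an assumption, as this page gives no description). A minimal, self-contained sketch of joining such lines into the full generated text:

      ```java
      import java.util.List;

      public class StreamJoin {
          // Extract the value of the "response" field from one Ollama stream line.
          // Hand-rolled string parsing for the sketch; a real client would use a
          // JSON library and handle escaped quotes.
          static String responseField(String json) {
              String key = "\"response\":\"";
              int start = json.indexOf(key);
              if (start < 0) return "";
              start += key.length();
              int end = json.indexOf('"', start);
              return json.substring(start, end);
          }

          public static void main(String[] args) {
              // Shape of Ollama's streaming generate output: one JSON object per line,
              // each carrying a fragment of the answer, the last marked "done":true.
              List<String> lines = List.of(
                  "{\"model\":\"llama2\",\"response\":\"Hello\",\"done\":false}",
                  "{\"model\":\"llama2\",\"response\":\" world\",\"done\":false}",
                  "{\"model\":\"llama2\",\"response\":\"\",\"done\":true}");
              StringBuilder sb = new StringBuilder();
              for (String line : lines) sb.append(responseField(line));
              System.out.println(sb);  // Hello world
          }
      }
      ```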
    • innerGenerate

      protected LLMResult innerGenerate(List<String> prompts, List<String> stop)
      Call out to Ollama's generate endpoint.
      Specified by:
      innerGenerate in class BaseLLM
      Parameters:
      prompts - The prompts to pass into the model.
      stop - A list of stop words to use when generating.
      Returns:
      The LLMResult generated by the model.
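      The overall shape of an innerGenerate implementation (one list of generations per input prompt, with stop words truncating the output) can be sketched as follows. The Generation/LLMResult records and the echo "model call" here are simplified stand-ins, not the library's actual types or logic:

      ```java
      import java.util.ArrayList;
      import java.util.List;

      public class GenerateSketch {
          // Simplified stand-ins for the library's Generation/LLMResult types.
          record Generation(String text) {}
          record LLMResult(List<List<Generation>> generations) {}

          // innerGenerate-style loop: one list of generations per input prompt.
          static LLMResult innerGenerate(List<String> prompts, List<String> stop) {
              List<List<Generation>> all = new ArrayList<>();
              for (String prompt : prompts) {
                  String text = "echo: " + prompt;     // placeholder for the model call
                  for (String s : stop) {              // truncate at the first stop word
                      int i = text.indexOf(s);
                      if (i >= 0) text = text.substring(0, i);
                  }
                  all.add(List.of(new Generation(text)));
              }
              return new LLMResult(all);
          }

          public static void main(String[] args) {
              LLMResult r = innerGenerate(List.of("hello world"), List.of("world"));
              System.out.println(r.generations().get(0).get(0).text());  // echo: hello 
          }
      }
      ```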
    • asyncInnerGenerate

      protected reactor.core.publisher.Flux<AsyncLLMResult> asyncInnerGenerate(List<String> prompts, List<String> stop)
      Description copied from class: BaseLLM
      Run the LLM on the given prompts asynchronously.
      Specified by:
      asyncInnerGenerate in class BaseLLM
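      The returned Flux emits AsyncLLMResult items as they become available rather than one complete result. The essence of that push-based pattern, which reactor's Flux generalizes, can be sketched without the reactor dependency (produce, onNext, and onComplete here are illustrative names, not library API):

      ```java
      import java.util.List;
      import java.util.function.Consumer;

      public class FluxSketch {
          // Minimal push-style stream: the producer calls onNext once per chunk
          // as it arrives, then onComplete when generation finishes.
          static void produce(List<String> chunks, Consumer<String> onNext, Runnable onComplete) {
              chunks.forEach(onNext);
              onComplete.run();
          }

          public static void main(String[] args) {
              StringBuilder sb = new StringBuilder();
              produce(List.of("The sky ", "is blue."),
                      sb::append,                      // consume each chunk as it arrives
                      () -> System.out.println(sb));   // The sky is blue.
          }
      }
      ```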
    • streamResponseToGenerationChunk

      public static GenerationChunk streamResponseToGenerationChunk(String streamResponse)
      Convert a stream response to a generation chunk.
      Parameters:
      streamResponse - The stream response as a JSON string.
      Returns:
      A GenerationChunk object containing the converted data.
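      A standalone sketch of what such a conversion involves: pulling the "response" text and the "done" flag out of one stream line. The Chunk record is a simplified stand-in for GenerationChunk, and the string parsing stands in for the JSON mapping the real method presumably performs:

      ```java
      public class ChunkDemo {
          // Simplified stand-in for the library's GenerationChunk.
          record Chunk(String text, boolean done) {}

          // Extract the "response" text and "done" flag from one stream line.
          // Hand-rolled parsing for the sketch; does not handle escaped quotes.
          static Chunk toChunk(String streamResponse) {
              String key = "\"response\":\"";
              int i = streamResponse.indexOf(key);
              String text = "";
              if (i >= 0) {
                  int start = i + key.length();
                  text = streamResponse.substring(start, streamResponse.indexOf('"', start));
              }
              boolean done = streamResponse.contains("\"done\":true");
              return new Chunk(text, done);
          }

          public static void main(String[] args) {
              Chunk c = toChunk("{\"model\":\"llama2\",\"response\":\"Hi\",\"done\":false}");
              System.out.println(c.text() + " / " + c.done());  // Hi / false
          }
      }
      ```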