Class LLMChain

java.lang.Object
com.hw.langchain.chains.base.Chain
com.hw.langchain.chains.llm.LLMChain
Direct Known Subclasses:
ConversationChain

public class LLMChain extends Chain
Chain that runs queries against LLMs.
Author:
HamaWhite
  • Field Details

    • llm

      protected BaseLanguageModel llm
    • prompt

      protected BasePromptTemplate prompt
      Prompt object to use.
    • outputKey

      protected String outputKey
      Key under which the chain's output is returned; outputKeys() always returns this single key.
    • outputParser

      protected BaseLLMOutputParser<String> outputParser
      Output parser to use. Defaults to one that takes the most likely string but does not change it.
    • returnFinalOnly

      protected boolean returnFinalOnly
      Whether to return only the final parsed result. Defaults to true. If false, additional information about the generation is returned as well.
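The default output parser described above simply takes the most likely generation and returns it unchanged. A minimal plain-Java sketch of that pass-through behavior (the interface and class names here are hypothetical stand-ins, not the library's actual types):

```java
import java.util.List;

// Hypothetical stand-in for BaseLLMOutputParser<String>.
interface OutputParser<T> {
    T parseResult(List<String> generations);
}

// Mirrors the default: pick the most likely string, do not change it.
class NoOpStrParser implements OutputParser<String> {
    @Override
    public String parseResult(List<String> generations) {
        return generations.get(0);
    }
}

public class Main {
    public static void main(String[] args) {
        OutputParser<String> parser = new NoOpStrParser();
        // The first (most likely) generation is returned verbatim.
        System.out.println(parser.parseResult(List.of("Hello, world!", "alternative")));
    }
}
```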
  • Constructor Details

  • Method Details

    • chainType

      public String chainType()
      Specified by:
      chainType in class Chain
    • inputKeys

      public List<String> inputKeys()
      Returns whatever input keys the prompt template expects.
      Specified by:
      inputKeys in class Chain
      Returns:
      the list of input keys
    • outputKeys

      public List<String> outputKeys()
      Always returns the single text output key.
      Specified by:
      outputKeys in class Chain
      Returns:
      the list of output keys
    • innerCall

      protected Map<String,String> innerCall(Map<String,Object> inputs)
      Description copied from class: Chain
      Runs the logic of this chain and returns the output.
      Specified by:
      innerCall in class Chain
      Parameters:
      inputs - the inputs to be processed by the chain
      Returns:
      a map containing the output generated by the chain
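Conceptually, innerCall formats the prompt template with the input map, sends the formatted prompt to the model, and stores the completion under the output key. A self-contained sketch of that flow with a stubbed model (the template syntax, stub, and all names are hypothetical, for illustration only):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class InnerCallSketch {
    static final String OUTPUT_KEY = "text";

    // Stub standing in for the LLM call.
    static final Function<String, String> FAKE_LLM =
            prompt -> "echo: " + prompt;

    // Format the template with the inputs, call the model, key the result.
    static Map<String, String> innerCall(String template, Map<String, Object> inputs) {
        String prompt = template;
        for (Map.Entry<String, Object> e : inputs.entrySet()) {
            prompt = prompt.replace("{" + e.getKey() + "}", e.getValue().toString());
        }
        Map<String, String> output = new HashMap<>();
        output.put(OUTPUT_KEY, FAKE_LLM.apply(prompt));
        return output;
    }

    public static void main(String[] args) {
        Map<String, Object> inputs = Map.of("topic", "ducks");
        Map<String, String> out = innerCall("Tell me a joke about {topic}.", inputs);
        System.out.println(out.get(OUTPUT_KEY)); // echo: Tell me a joke about ducks.
    }
}
```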
    • asyncInnerCall

      protected reactor.core.publisher.Flux<Map<String,String>> asyncInnerCall(Map<String,Object> inputs)
      Description copied from class: Chain
      Runs the logic of this chain and returns the async output.
      Overrides:
      asyncInnerCall in class Chain
      Parameters:
      inputs - the inputs to be processed by the chain
      Returns:
      a Flux of maps containing the outputs generated by the chain
    • predict

      public String predict(Map<String,Object> kwargs)
      Format prompt with kwargs and pass to LLM.
      Parameters:
      kwargs - Keys to pass to prompt template.
      Returns:
      Completion from LLM.
    • asyncPredict

      public reactor.core.publisher.Flux<String> asyncPredict(Map<String,Object> kwargs)
      Format prompt with kwargs and pass to LLM asynchronously.
      Parameters:
      kwargs - Keys to pass to prompt template.
      Returns:
      A Flux that streams the completion from the LLM.
    • predictAndParse

      public <T> T predictAndParse(Map<String,Object> kwargs)
      Call predict and then parse the results.
      Parameters:
      kwargs - Keys to pass to prompt template.
      Returns:
      Completion from LLM, parsed by the configured output parser.
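predictAndParse composes the two steps above: obtain the raw completion via predict, then feed it through the output parser. A hedged plain-Java sketch of that composition (the stubbed completion and all names are hypothetical, not the library's API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;

public class PredictAndParseSketch {
    // Stand-in for predict(...): returns a raw completion string.
    static String predict() {
        return "red, green, blue";
    }

    // Run predict, then apply the parser to its result.
    static <T> T predictAndParse(Function<String, T> parser) {
        return parser.apply(predict());
    }

    public static void main(String[] args) {
        // Example parser: split a comma-separated completion into a list.
        List<String> colors = predictAndParse(s -> Arrays.asList(s.split(",\\s*")));
        System.out.println(colors); // [red, green, blue]
    }
}
```

With the default pass-through parser, predictAndParse behaves like predict; a custom parser lets callers recover structured values (lists, objects) from the raw completion.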