Use text-bison-32k for longer input limit

Tan Le requested to merge duo-chat-evaluate-text-bison-32k into main

What does this merge request do and why?

Use text-bison-32k for its longer input token limit.

This fixes token limit errors when answering and evaluating inputs with large context, such as issue/epic tasks.

Relates to #197 (closed)
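
The repository's own configuration is not reproduced here; as a minimal sketch, assuming the evaluation code loads the model through the Vertex AI Python SDK, the swap amounts to changing the model name passed to `from_pretrained`:

```python
from vertexai.language_models import TextGenerationModel

# Before: text-bison, whose smaller input limit is exceeded by large
# issue/epic context.
# model = TextGenerationModel.from_pretrained("text-bison")

# After: text-bison-32k, which accepts a much larger token budget.
model = TextGenerationModel.from_pretrained("text-bison-32k")
```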

How to set up and validate locally

Numbered steps to set up and validate the change are strongly suggested.
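
No project-specific steps are listed, but a rough local smoke test (assuming access to a GCP project with Vertex AI enabled; the project and location below are placeholders) could confirm that a prompt too large for text-bison is handled by text-bison-32k:

```python
import vertexai
from vertexai.language_models import TextGenerationModel

# Placeholder GCP settings -- substitute your own project and region.
vertexai.init(project="my-gcp-project", location="us-central1")

# Build a prompt large enough to exceed text-bison's input limit while
# staying comfortably inside text-bison-32k's larger context window.
large_context = "word " * 12_000
prompt = f"Summarise the following issue discussion:\n{large_context}"

model = TextGenerationModel.from_pretrained("text-bison-32k")
response = model.predict(prompt, max_output_tokens=256, temperature=0.2)
print(response.text[:500])
```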

Merge request checklist

  • I've run the affected pipeline(s) to validate that nothing is broken.
    • dev-ai-research-0e2f8974:duo_chat_experiments.tl_code_explanation_text_bison_32k_20240320_161400__independent_llm_judge
    • dev-ai-research-0e2f8974:duo_chat_experiments.tl_code_explanation_text_bison_32k_20240320_161400__similarity_score
  • Tests added for new functionality. If not, please raise an issue to follow up.
  • Documentation added/updated, if needed.