ai_dependent.rb

# frozen_string_literal: true

module Gitlab
  module Llm
    module Chain
      module Concerns
        # Shared behavior for chat tools that build a provider-specific
        # prompt and send it through the context's AI request object.
        module AiDependent
          include ::Gitlab::Llm::Concerns::Logger

          # Builds the provider-specific prompt from the tool's options.
          def prompt
            provider_prompt_class.prompt(prompt_options)
          end

          # Logs the prompt content and sends it through the AI request,
          # optionally flagging it for the AI gateway agent prompt flow.
          def request(&block)
            prompt_str = prompt
            prompt_text = prompt_str[:prompt]

            log_conditional_info(context.current_user,
              message: "Content of the prompt from chat request",
              event_name: 'prompt_content',
              ai_component: 'duo_chat',
              prompt: prompt_text)

            if use_ai_gateway_agent_prompt?
              prompt_str[:options] ||= {}
              prompt_str[:options].merge!({
                use_ai_gateway_agent_prompt: true,
                inputs: prompt_options
              })
            end

            ai_request.request(prompt_str, unit_primitive: unit_primitive, &block)
          end

          # Returns a proc that wraps each streamed chunk in a response
          # modifier and forwards it to the stream response handler,
          # when one is configured.
          def streamed_request_handler(streamed_answer)
            proc do |content|
              next unless stream_response_handler

              chunk = streamed_answer.next_chunk(content)

              if chunk
                stream_response_handler.execute(
                  response: Gitlab::Llm::Chain::StreamedResponseModifier
                              .new(streamed_content(content, chunk), chunk_id: chunk[:id]),
                  options: { chunk_id: chunk[:id] }
                )
              end
            end
          end

          private

          def ai_request
            context.ai_request
          end

          # Looks up the prompt class for the current provider, keyed by the
          # demodulized, underscored name of the AI request class.
          def provider_prompt_class
            ai_provider_name = ai_request.class.name.demodulize.underscore.to_sym

            self.class::PROVIDER_PROMPT_CLASSES[ai_provider_name]
          end

          def unit_primitive
            nil
          end

          def use_ai_gateway_agent_prompt?
            false
          end

          # This method is modified in SingleActionExecutor for Duo Chat
          def streamed_content(content, _chunk)
            content
          end
        end
      end
    end
  end
end
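As a hedged illustration of the pattern above, here is a minimal, self-contained sketch of how an including tool class supplies `PROVIDER_PROMPT_CLASSES` and how `provider_prompt_class` resolves a prompt class from the request object's class name. All class and method names here (`AiDependentSketch`, `AnthropicRequest`, `AnthropicPrompt`, `ExampleTool`) are hypothetical stand-ins, not part of GitLab; the real concern relies on ActiveSupport's `demodulize`/`underscore`, which this sketch reimplements inline.

```ruby
# Simplified stand-in for the AiDependent concern (hypothetical names).
module AiDependentSketch
  # Mirrors #provider_prompt_class: demodulize and underscore the request
  # class name, then use the symbol as the lookup key.
  def provider_prompt_class
    key = ai_request.class.name.split('::').last
                    .gsub(/([a-z])([A-Z])/, '\1_\2').downcase.to_sym
    self.class::PROVIDER_PROMPT_CLASSES[key]
  end

  # Mirrors #prompt: delegate to the resolved provider prompt class.
  def prompt
    provider_prompt_class.prompt(prompt_options)
  end
end

class AnthropicRequest; end # stands in for context.ai_request's class

class AnthropicPrompt
  def self.prompt(options)
    { prompt: "Answer: #{options[:input]}" }
  end
end

class ExampleTool
  include AiDependentSketch

  # Maps the underscored request-class name to a prompt class.
  PROVIDER_PROMPT_CLASSES = { anthropic_request: AnthropicPrompt }.freeze

  attr_reader :ai_request

  def initialize(ai_request)
    @ai_request = ai_request
  end

  def prompt_options
    { input: "hello" }
  end
end

tool = ExampleTool.new(AnthropicRequest.new)
tool.prompt # => { prompt: "Answer: hello" }
```

The lookup key convention means adding a new provider is just another entry in `PROVIDER_PROMPT_CLASSES`; no conditional branching in the concern itself.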