Draft: Rough Draft of Conversational CI backend

Allison Browne requested to merge ab-ci-endpoint-mvc into master

What does this MR do and why?

This MR introduces a GraphQL mutation and a query field.

The mutation takes the message content from the user and sends it to the OpenAI endpoint. It then persists both the user's message and the assistant's response to the database.

The query field allows the frontend to poll for the entire conversation, paginated via GraphQL.
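As a minimal pure-Ruby sketch of that flow (the `client` and `store` names here are illustrative stand-ins, not the classes this MR actually introduces):

```ruby
# Illustrative sketch only: `client` is a hypothetical stand-in for the
# OpenAI call, and `store` stands in for database persistence.
def generate_ci_config(client, store, user_content)
  # Persist the user's message first.
  store << { role: 'user', content: user_content }

  # Send the content to the (stubbed) OpenAI endpoint.
  reply = client.complete(user_content)

  # Persist the assistant's response alongside it.
  store << { role: 'assistant', content: reply }
  store
end
```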

Issue: https://gitlab.com/gitlab-org/gitlab/-/issues/407739

Screenshots or screen recordings

How to set up and validate locally

  1. Enable the ai_ci_config_generator feature flag for this feature
    Feature.enable(:ai_ci_config_generator)
  2. Enable the :openai_experimentation feature flag that the llm_client is behind
    Feature.enable(:openai_experimentation)
  3. Run rails c, then set the API key locally. You have to request one: https://gitlab.slack.com/archives/C0517A9PZ8S/p1681445186381439
    Gitlab::CurrentSettings.update(openai_api_key: "<your-key>")
  4. Navigate to your GraphQL explorer, for instance localhost:3000/-/graphql-explorer
  5. Run the mutation
    mutation {
      ciAiGenerateConfig(input: {
        projectPath: "root/cache-test",
        userContent: "How can I deploy a python application?"
      }) {
        errors
        userMessage {
          id
          role
          content
        }
      }
    }
  6. Query for conversation messages
     query q1 {
       project(fullPath: "root/cache-test") {
         aiConversations {
           ciConfigMessages {
             nodes {
               id
               content
               role
               errors
               isFetching
             }
           }
         }
       }
     }
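On success, the mutation in step 5 returns a payload shaped roughly like the following (the ID value here is illustrative, not actual output):

```json
{
  "data": {
    "ciAiGenerateConfig": {
      "errors": [],
      "userMessage": {
        "id": "gid://gitlab/Ci::Editor::AiConversation::Message/1",
        "role": "user",
        "content": "How can I deploy a python application?"
      }
    }
  }
}
```

The frontend can then poll the step-6 query until the assistant message's `isFetching` is false.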

Note

Since clearing messages over the token limit is not yet implemented, you can delete them all like this:

  1. rails c
  2. Destroy all messages:
    Ci::Editor::AiConversation::Message.destroy_all

@andrei.zubov, I wasn't sure if you wanted them ordered by created_at asc or desc. If you want to return the results in the opposite order, you can apply this patch:

diff --git a/ee/app/models/ai/project/conversations.rb b/ee/app/models/ai/project/conversations.rb
index 081aebd38f48..91ca830ef2c5 100644
--- a/ee/app/models/ai/project/conversations.rb
+++ b/ee/app/models/ai/project/conversations.rb
@@ -9,7 +9,7 @@ def initialize(project, user)
       end

       def ci_config_messages
-        Ci::Editor::AiConversation::Message.where(project: @project, user: @user).order(:created_at)
+        Ci::Editor::AiConversation::Message.where(project: @project, user: @user).order(created_at: :desc)
       end
     end
   end

TODOS

  • We have deduplication, but if the content is changed the user could send another message while the previous one is still processing. We should make sure that the service which calls the worker locks the resource for editing in some way. We could just return early if the last message for the user/project combo is still being processed.
  • Add feature checks for Ultimate
  • Add tests
  • Take token limits into consideration
  • Write code to clear messages older than 90 days (legal limitation)
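A plain-Ruby sketch of the 90-day cutoff from the last TODO (in the real model this would likely be an ActiveRecord scope along the lines of `where('created_at < ?', 90.days.ago)`; the names below are illustrative, not MR code):

```ruby
require 'time'

# Messages older than this many days must be cleared, per the legal
# limitation noted in the TODOs.
RETENTION_DAYS = 90

# Selects the messages that have aged past the retention window.
# `messages` is an array of hashes with a :created_at Time value.
def expired_messages(messages, now: Time.now)
  cutoff = now - (RETENTION_DAYS * 24 * 60 * 60)
  messages.select { |m| m[:created_at] < cutoff }
end
```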
Edited by Allison Browne
