Generate a cube query using Vertex LLM
What does this MR do and why?
- Adds a new AI Action `:generate_cube_query` that takes a natural-language question and returns a valid query in the Cube query format.
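
For context, a Cube query is a JSON object describing measures, dimensions, time dimensions and filters. A rough sketch of the kind of structure the action should produce, written here as a Ruby hash; the cube and measure names are hypothetical, not the real product analytics schema:

```ruby
# Illustrative only - "TrackedEvents.uniqueUsersCount" and "TrackedEvents.derivedTstamp"
# are hypothetical member names; the LLM is expected to emit real cube members.
cube_query = {
  "measures" => ["TrackedEvents.uniqueUsersCount"],
  "timeDimensions" => [
    { "dimension" => "TrackedEvents.derivedTstamp", "dateRange" => "this week" }
  ],
  "limit" => 100
}
```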
Things this MR is missing
- Specs
- Battle-testing the LLM prompt
- Authorization (currently anybody can use it)
- Prompt injection protection against access to other projects' data
- ...probably other things
Screenshots or screen recordings
Screenshots are required for UI changes, and strongly recommended for all other merge requests.
| Before | After |
| --- | --- |
How to set up and validate locally
- Check out the branch.
- Enable AI features locally: https://docs.gitlab.com/ee/development/ai_features/#test-ai-features-locally (You might be able to skip the embedding database, I'm not sure)
- Make sure you've enabled Vertex access and configured it with your GDK. This action does not use Anthropic's API, only Vertex.
- Enable the `:expanded_ai_logging` feature flag locally.
- Enable the `:generate_cube_query` feature flag locally. NEW (A console sketch for enabling both flags follows this list.)
- Update the code here from `perform_async` to `perform_inline`. (This just makes testing easier, don't commit it anywhere. There's a generic sketch of this change after the list, too.)
- In another terminal, tail the LLM logs in your GDK: `tail -f log/llm.log`
- Make a GraphQL call like this - feel free to update the `question` parameter:
```graphql
mutation {
  aiAction(input: {generateCubeQuery: {question: "How many unique users used the application this week?", resourceId: "gid://gitlab/Project/19"}, clientSubscriptionId: "ARandomID"}) {
    clientMutationId
    errors
  }
}
```
- Assuming everything was set up correctly, you should (after a second or two) see a response from Vertex with a Cube query 👇 This will also be sent to any open GraphQL subscriptions, but looking at the logs is a quicker way to see that it's worked.
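
For the two feature-flag steps above, a minimal sketch of one way to enable them from a Rails console in your GDK (`Feature.enable` is the standard GitLab flag helper; the flag names are the ones this MR references):

```ruby
# gdk rails console (or `bundle exec rails console` from the gitlab checkout)
Feature.enable(:expanded_ai_logging)  # log full LLM prompts/responses to log/llm.log
Feature.enable(:generate_cube_query)  # turn on the new AI action
```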
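For the `perform_async` → `perform_inline` step, only the method name changes; `perform_inline` is a standard Sidekiq worker method that runs the job synchronously in the current process. A generic sketch (the worker name and arguments here are placeholders, not the actual code on the linked line):

```ruby
# Before: the completion is enqueued and handled by Sidekiq in the background.
SomeCompletionWorker.perform_async(user.id, resource.id, options)

# After (local testing only): the job runs synchronously, so the Vertex call and
# the llm.log entries happen as part of the same request.
SomeCompletionWorker.perform_inline(user.id, resource.id, options)
```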
MR acceptance checklist
This checklist encourages us to confirm any changes have been analyzed to reduce risks in quality, performance, reliability, security, and maintainability.
- I have evaluated the MR acceptance checklist for this MR.