Add a method to the ai client which allows for passing message history
## What does this MR do and why?
We are building a chat bot to help generate CI files: https://gitlab.com/gitlab-org/gitlab/-/issues/407739

For the OpenAI API to know about the history of a conversation, we need to pass it all of the previous context in each request. The existing `chat` method only allows a single message from a user; this new functionality also needs to pass back responses from the AI under the `assistant` role.
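To illustrate the shape of the payload, here is a minimal sketch in plain Ruby (the `ConversationHistory` class is hypothetical, not part of this MR) showing how a conversation accumulates alternating `user` and `assistant` messages:

```ruby
# Hypothetical helper (not the actual client code): accumulates the
# conversation so each request can carry the full context.
# Roles follow the OpenAI chat format: 'user' and 'assistant'.
class ConversationHistory
  def initialize
    @messages = []
  end

  def add_user(content)
    @messages << { role: 'user', content: content }
    self
  end

  def add_assistant(content)
    @messages << { role: 'assistant', content: content }
    self
  end

  # The array passed as `messages:` in each request.
  def to_a
    @messages
  end
end
```

Each follow-up request resends this whole array, which is why the new method accepts a list of messages rather than a single content string.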
## How to set up and validate locally
- Open a Rails console: `rails c`
- Enable the `openai_experimentation` feature flag that the `llm_client` is behind: `Feature.enable(:openai_experimentation)`
- Set up the API key locally. You have to request one: https://gitlab.slack.com/archives/C0517A9PZ8S/p1681445186381439

  ```ruby
  > Gitlab::CurrentSettings.update(openai_api_key: "<your-key>")
  ```
- Run a chat request, capturing the response for the next step:

  ```ruby
  > response = Gitlab::Llm::OpenAi::Client.new(User.first).chat(content: 'What frameworks are popular to use with the ruby programming language?')
  => {"id"=>"chatcmpl-123",
      "object"=>"chat.completion",
      "created"=>1682009171,
      "model"=>"gpt-3.5-turbo-0301",
      "usage"=>{"prompt_tokens"=>20, "completion_tokens"=>122, "total_tokens"=>142},
      "choices"=>
       [{"message"=>
          {"role"=>"assistant",
           "content"=>
            "Some popular frameworks to use with the Ruby programming language are:\n\n1. Ruby on Rails - a web application framework that follows the Model-View-Controller (MVC) architectural pattern\n2. Sinatra - a lightweight web application framework that allows developers to create small web applications quickly and easily\n3. Hanami - a modern web application framework that emphasizes modularity and flexibility\n4. Padrino - a web framework built on top of Sinatra that adds more features and functionality for larger web applications\n5. Grape - a framework for building RESTful APIs in Ruby that is designed to be simple and fast."},
         "finish_reason"=>"stop",
         "index"=>0}]}
  ```
- Use the response to run a chat messages request with a contextual response:

  ```ruby
  > Gitlab::Llm::OpenAi::Client.new(User.first).messages_chat(
      messages: [
        { role: 'user', content: 'What frameworks are popular to use with the ruby programming language?' },
        { role: 'assistant', content: response["choices"][0]['message']['content'] },
        { role: 'user', content: 'Can you write that in a bullet format list?' },
      ]
    )
  => {"id"=>"chatcmpl-123",
      "object"=>"chat.completion",
      "created"=>1682009603,
      "model"=>"gpt-3.5-turbo-0301",
      "usage"=>{"prompt_tokens"=>162, "completion_tokens"=>38, "total_tokens"=>200},
      "choices"=>
       [{"message"=>
          {"role"=>"assistant",
           "content"=>
            "Sure, here's a bullet format list of popular frameworks to use with the Ruby programming language:\n\n- Ruby on Rails\n- Sinatra\n- Hanami\n- Padrino\n- Grape"},
         "finish_reason"=>"stop",
         "index"=>0}]}
  ```
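For clarity, the assistant reply threaded into the follow-up request is just the nested content string from the first response hash. A standalone illustration with a stubbed response (no real API call; the hash values are placeholders):

```ruby
# Stubbed hash mimicking the shape of a chat completion response
# (placeholder content, not a real API call).
response = {
  "choices" => [
    { "message" => { "role" => "assistant", "content" => "- Ruby on Rails\n- Sinatra\n- Hanami" } }
  ]
}

# The assistant's reply is the nested content string; it becomes the
# 'assistant' entry in the follow-up request's message history.
assistant_reply = response["choices"][0]["message"]["content"]

follow_up_messages = [
  { role: 'user', content: 'What frameworks are popular to use with Ruby?' },
  { role: 'assistant', content: assistant_reply },
  { role: 'user', content: 'Can you write that in a bullet format list?' }
]
```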
Edited by Allison Browne