@ogolowinski @rayana yay! OK, now we start crafting the discussion guide! You can use the link to that template - just save a copy in your drive and modify it.
When you're ready, share it with me and ping me in the issue so I can help! :)
We'll also need a recruiting request and a screener. (link to the screener template - it works the same way as the discussion guide)
@ogolowinski, @mnichols1 needs to start fleshing out the script part of the guide. @mnichols1 please let me know how I can support you with this part of the process. In light of our recent conversation, I suggest you take a stab at writing the questions for the script so others can read them and know what you asked in those sessions - you don't have to use the script when you run the sessions. :)
Also, I don't know if we have a recruiting request or a timeline for when y'all want to do the sessions for this one. We'll also need a screener (if we don't have one yet). :)
Found a bug - if you answer "no" to "Do you use GitLab?", it still asks if you use AutoDevops. That question can be skipped in this case, since AutoDevops is specific to GitLab.
The questions end really quickly, unlike our previous surveys - I assume this is because we want to schedule interviews rather than collect all the answers we need yet. If that's the case, I think we can remove the company industry question, the job title question, and even the GitLab tier and platform questions.
@loriewhitaker great - saw it's fixed.
Can we remove the following questions: the company industry question, the job title question, and the GitLab tier and platform questions?
@ogolowinski I removed the industry question and the job title question.
I left the GitLab question in there since you said that the AutoDevops question is specific to GL. Right now, if you say 'yes' to using GL, you get taken to the AutoDevops question. If you say 'no' to using GL, you are taken to the Kubernetes question. Let me know if that's wrong! And I'll fix it. :)
@ogolowinski Also, I'm not sure about recruiting and interviewing. We have a discussion guide, but I'm unclear on who will be conducting the research. Initially Mike was going to, but that was before he moved to his new team. Unfortunately I can't take this research on as I am taking on the MR research across all stage groups soon.
We need to start a recruiting issue and bring in Emily to see what her thoughts are. We could use Respondent.io or UserInterviews.com to find people in addition to our database, but I'll let her weigh in.
@ogolowinski I have created a recruitment issue! The next step would be to create the screener survey needed for it! Let me know if you want help with that!
I captured:
Observe alerts from the monitoring system that cause the deployment to roll back.
Exceptions (e.g., in Ruby), like a panic, driven by what the user sets - a family of golden signals around API latency, CPU, and memory usage; if these exceed their thresholds, the deployment rolls back.
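To make that note concrete, here's a minimal sketch of the kind of threshold check the participant described - purely illustrative, not their actual setup; the signal names and limits are made up:

```python
# Hypothetical illustration of the rollback logic described above:
# a set of "golden signal" thresholds that, when exceeded, trigger a rollback.
# Signal names and limits are invented for the example.

GOLDEN_SIGNAL_THRESHOLDS = {
    "api_latency_ms": 500,   # p95 API latency
    "cpu_percent": 90,       # CPU usage
    "memory_percent": 85,    # memory usage
}

def should_roll_back(current_metrics: dict) -> bool:
    """Return True if any golden signal exceeds its configured threshold."""
    return any(
        current_metrics.get(signal, 0) > limit
        for signal, limit in GOLDEN_SIGNAL_THRESHOLDS.items()
    )

# Example: a latency spike after a deploy would trigger a rollback.
print(should_roll_back({"api_latency_ms": 750, "cpu_percent": 40}))  # True
```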
@ogolowinski awesome! The YouTube video seems to be private. Also, is there a UXR insight issue and an interview issue we can reference? Alternatively, it might make sense to create those as a result of this research.
@ogolowinski can you create the issue we discussed in our 1:1 regarding:
Another aspect: a different kind of metrics. Some metrics should trigger automatic rollback by default. We need to see how to set these configurations - this requires design thinking.
@ogolowinski sounds good! The issues in &3357 (closed) are intended for future milestones and preferably shouldn't touch the immediate milestone, as they're meant to push discussion rather than straight-up solution design work.
As discussed in the recent insights meeting, the next step will be to set up a Think Big session with Engineering to go over the direction we want to take this in and which feature implementation issues we can create to support that direction, with research fueling our decision making.
@dimitrieh do we want to close this particular research issue, since it was directly related to customer research? If so, do we need a design issue opened to capture the work with Engineering?