At GitLab, we're passionate about using our own products internally, a.k.a. dogfooding. Dogfooding has helped us shorten our software delivery cycle time and ship improvements to customers faster. This article spotlights a specific use case where GitLab Value Stream Management (VSM) drove measurable improvements for our engineering team. You'll learn how VSM helped us tackle two critical challenges: measuring the journey from idea conception to merge request (MR) completion, and streamlining our deployment workflows.
The Challenge: Identifying bottlenecks in MR reviews
Despite having well-defined workflows, one team noticed that MRs were taking longer than expected to be reviewed and merged. The challenge wasn’t just about the delays themselves, but about understanding where in the review process these delays were happening and why.
The team's goals were clear:
- Identify where time was being spent from the initial idea to the final merge of an MR.
- Pinpoint specific bottlenecks in the review process.
- Understand how MR size, complexity, or documentation quality affected review time.
The Approach: Measuring MR review time with GitLab Value Stream Analytics
Value Stream Analytics (VSA) enables organizations to map their entire workflow from idea to delivery, distinguishing between value-adding activities (VA) and non-value-adding activities (NVA) in the process flow. By calculating the ratio of value-added time to total lead time, a team can identify the wasteful activities that delay MR reviews.
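To make that ratio concrete, here is a minimal sketch in Python; the stage names and durations are hypothetical, purely for illustration:

```python
# Minimal sketch: flow efficiency = value-added time / total lead time.
# Stage names and durations below are hypothetical, for illustration only.
stage_hours = {
    "coding": (40, True),                 # (duration in hours, value-adding?)
    "waiting for reviewer": (30, False),
    "review": (12, True),
    "idle after approval": (18, False),
}

value_added = sum(hours for hours, is_va in stage_hours.values() if is_va)
total_lead_time = sum(hours for hours, _ in stage_hours.values())

print(f"Flow efficiency: {value_added / total_lead_time:.0%}")  # -> 52%
```

The lower that percentage, the more of the lead time is spent waiting rather than working, which is exactly the waste VSA is designed to surface.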
To obtain the necessary metrics, the team customized GitLab VSA to gain better visibility into our MR review process.
1. Setting up a custom stage for MR review
The team added a new custom stage in VSA called Review Time to Merge to specifically track the time from when a reviewer was first assigned to when the MR was merged.
- Start event: MR first reviewer assigned
- End event: MR merged
By defining this stage, VSA began measuring the duration of the MR review process, giving us precise data on where time was being spent.
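VSA computes this stage natively, but you can approximate the same measurement outside the UI. Below is a rough sketch using the python-gitlab library; it assumes the first "requested review from" system note marks the review start, and the URL, token, and project path are placeholders:

```python
# Sketch: approximate the "Review Time to Merge" stage outside VSA,
# using python-gitlab (https://python-gitlab.readthedocs.io).
# Assumptions: the first "requested review from" system note marks the
# review start; the URL, token, and project path are placeholders.
from datetime import datetime

import gitlab

gl = gitlab.Gitlab("https://gitlab.com", private_token="YOUR_TOKEN")
project = gl.projects.get("your-group/your-project")

TIMESTAMP_FMT = "%Y-%m-%dT%H:%M:%S.%f%z"

def review_time_hours(mr):
    """Hours from the first reviewer request to merge, or None if unknown."""
    review_start = None
    for note in mr.notes.list(order_by="created_at", sort="asc", iterator=True):
        if note.system and note.body.startswith("requested review from"):
            review_start = note.created_at
            break
    if review_start is None or mr.merged_at is None:
        return None
    delta = (datetime.strptime(mr.merged_at, TIMESTAMP_FMT)
             - datetime.strptime(review_start, TIMESTAMP_FMT))
    return delta.total_seconds() / 3600
```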
2. Using the Total Time Chart for clarity
With the custom stage in place, the team used the Total Time Chart on the VSA Overview page (Analyze > Value Stream) to visualize how much time was spent during the new MR Review stage. By comparing the values represented by each area on the chart, the team could quickly identify how this stage contributed to the total software delivery lifecycle (SDLC) time.
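The chart makes this comparison visual; if you wanted the same breakdown from raw numbers, a quick sketch (with made-up stage totals) could look like this:

```python
# Sketch: how much each stage contributes to total SDLC time.
# Stage totals are hypothetical; VSA's Total Time Chart plots the real ones.
stage_totals_days = {
    "Issue": 2.0,
    "Plan": 1.5,
    "Code": 3.0,
    "Review Time to Merge": 6.5,
}

total = sum(stage_totals_days.values())
for stage, days in sorted(stage_totals_days.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{stage:<22} {days:>4.1f} d  ({days / total:.0%} of total)")
```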
3. Drilling down for deeper insights
To investigate specific delays, the team used the Stage Navigation Bar to dive deeper into the MR Review stage. This view allowed them to:
- Sort MRs by review time: The stage table showed all related MRs, sorted by review duration, making it easy to detect slow MRs (a scripted version of this ranking follows this list).
- Analyze individual MRs: For each MR, the team could examine factors such as reviewer assignment delays, multiple rounds of feedback, idle time after approval, and MR size/complexity.
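Continuing the earlier python-gitlab sketch, that ranking is straightforward to reproduce; this assumes the hypothetical review_time_hours() helper defined above:

```python
# Sketch: rank recently merged MRs by review duration, slowest first,
# reusing the hypothetical review_time_hours() helper from the earlier sketch.
# updated_after bounds the query; adjust the window to your needs.
mrs = project.mergerequests.list(
    state="merged", updated_after="2025-01-01T00:00:00Z", iterator=True
)

timed = [(mr, review_time_hours(mr)) for mr in mrs]

for mr, hours in sorted(
    ((mr, h) for mr, h in timed if h is not None),
    key=lambda pair: pair[1],
    reverse=True,
)[:10]:
    print(f"!{mr.iid}: {hours:.1f} h in review - {mr.title}")
```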
The outcome: Actionable insights and improvements
By customizing VSA to track MR review time, the team uncovered several key insights:
- Delays in reviewer assignment: Some MRs experienced delays because reviewers were assigned late, or reviewers had too many MRs in their queue.
- Slow review start times: Even after assignment, certain MRs sat idle before reviews began, often due to context switching or competing priorities.
- Multiple feedback loops: Larger MRs often required multiple rounds of feedback, which extended review time significantly.
- Idle time post-approval: Some MRs were approved but not merged promptly, often due to deployment coordination issues.
For the engineering manager on the team, VSA proved valuable in managing their team's workflow: "I've used the VSA to justify where we were spending time in MR completion. We have VSA customized to our needs, and it's been very beneficial to our investigations for opportunities for improvements."
This dogfooding experience is also driving a key enhancement to improve visibility into the review process: we're adding a new event to VSA, Merge request last approved at, which enables a stage that breaks down the MR review steps even further for granular visibility.
The power of data-driven decisions
By leveraging GitLab's VSA, we didn't just identify bottlenecks: we gained actionable insights that led to measurable improvements in MR review time and overall developer productivity. We shortened merge request review cycles and increased developer throughput, validating our commitment to continuous improvement through measurement.
Want to learn more about how VSA can help your team? Start a free, 60-day trial of GitLab Ultimate, customize your value streams, and see how you can make improvements throughout the SDLC for your teams.