
PullRequest Work


About PullRequest
PullRequest provides on-demand code review by a network of contracted engineers for development teams of any size. We integrate with the major code repository platforms to post comments on pull requests, and we also provide a customer web application where users can view code review analytics, metrics, and benchmarks, as well as manage settings for projects, advanced rules, membership, and billing. We also have a separate, dedicated application for our network of reviewers, where they review our customers’ code and pick up jobs unrelated to code review.
My Role
As the sole dedicated product designer at PullRequest, I apply the UX process to design all major new features for both our customer-facing app and our reviewer-facing code review platform.
Working at a small, agile startup with a consistent feedback loop from customers and reviewers allows our design and implementation process to move quickly. We’re constantly gathering feedback and analyzing our users’ activity to determine how we can best meet their needs.
In this portfolio project, I’ll give high-level overviews of two product features from our customer app: Advanced Review Settings and Dashboard & Metrics.
Customer Product Feature: Advanced Review Settings
Background Information:
Our customers are directed to set up project settings for each repository they connect to our platform. Users can designate either Auto Review, which automatically sends all of their pull requests to our network for review, or Manual Review, which requires them to request reviews manually via our platform.
The Problem:
Users are forced to approach their review settings in an all-or-nothing way. With Auto Review enabled, pull requests are often sent to the network even when they don’t need review. With Manual Review enabled, manually submitting pull requests via the platform can be a laborious, time-consuming task for larger teams with many pull requests.
The Goal:
Provide our users with a more advanced settings option that lets teams create organization- or project-specific rules to ensure only the intended pull requests are sent to our network for review.
Responsibilities:
User research, UX Design, Visual Design, QA Analysis
The Process:
Research & Ideation - Designs - Testing & Iterations

Research & Ideation
Before jumping into designs, I conducted customer interviews to determine which factors teams weigh when deciding which pull requests they’d like to send to our network for review.
Key Interview Findings:
Users didn’t want pull requests sent to the network if they were above or below a certain number of lines of code.
Users only wanted pull requests sent to the network that were authored by certain team members.
Users wanted the option to set certain date parameters.
Users only wanted pull requests sent to the network from certain file paths or base branches.
Users wanted a way to add specific text within the pull request description, commit message, or title that would determine whether it’s sent to our network.
Based on these interview findings, I compiled a list of inputs that could be combined in a conditional logic form to set specific rules (a rough data model is sketched after the Form Inputs list below). I also compiled a list of feature goals to ensure the feature was as user-friendly as possible and covered all of the possible edge cases and states.
Form Inputs:
Actions: Open review, Don’t open review, Cancel review
Supported Fields: title, description, base branch, head branch, author name, commit message, comment message, summary comment message, diff file path, diff line count, creation date
Comparators: matches, contains, is equal to, is greater than, is greater than or equal to, is less than, is less than or equal to
Final Input Type: string, number, date
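To make the conditional logic concrete, here’s a minimal sketch of how these inputs could be modeled as a single rule object. The names and shapes are illustrative assumptions for this write-up, not PullRequest’s actual schema.

```typescript
// Hypothetical model of one advanced review rule, assembled from the
// actions, supported fields, comparators, and input types listed above.
type Action = "open_review" | "dont_open_review" | "cancel_review";

type SupportedField =
  | "title" | "description" | "base_branch" | "head_branch"
  | "author_name" | "commit_message" | "comment_message"
  | "summary_comment_message" | "diff_file_path" | "diff_line_count"
  | "creation_date";

type Comparator =
  | "matches" | "contains" | "is_equal_to"
  | "is_greater_than" | "is_greater_than_or_equal_to"
  | "is_less_than" | "is_less_than_or_equal_to";

interface ReviewRule {
  action: Action;
  field: SupportedField;
  comparator: Comparator;
  value: string | number | Date; // final input type: string, number, or date
}

// Example: don't open a review for pull requests over 1,000 changed lines.
const largeDiffRule: ReviewRule = {
  action: "dont_open_review",
  field: "diff_line_count",
  comparator: "is_greater_than",
  value: 1000,
};
```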
Feature Goals:
Add a way for users to create advanced rules based on a number of fields and comparators at both the organization level and the project level.
Create a way for users to establish a rule hierarchy to avoid conflicts when more than one rule applies (see the conflict-resolution sketch after this list).
Allow users to edit or delete existing rules.
Add descriptive copy to ensure a smooth UX (include example rules and explanatory text).
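To illustrate the rule-hierarchy goal above, here’s a small sketch of how conflicts might be resolved when more than one rule matches a pull request. The priority field and resolveAction helper are hypothetical, assumed only for this example.

```typescript
// Minimal conflict-resolution sketch: each rule's position in the
// user-defined hierarchy decides which action wins when several match.
interface RankedRule {
  action: "open_review" | "dont_open_review" | "cancel_review";
  priority: number; // 1 = highest position in the hierarchy
}

// Return the action of the highest-priority matching rule, if any.
function resolveAction(matching: RankedRule[]): RankedRule["action"] | null {
  if (matching.length === 0) return null;
  return [...matching].sort((a, b) => a.priority - b.priority)[0].action;
}

// A "don't open review" rule at priority 1 overrides an "open review"
// rule at priority 3.
console.log(resolveAction([
  { action: "open_review", priority: 3 },
  { action: "dont_open_review", priority: 1 },
])); // -> "dont_open_review"
```

In this model the hierarchy acts as a simple tiebreaker: whichever matching rule sits highest determines what happens to the pull request.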

Designs
I’ve selected a number of designs to display, including an example user flow for adding a new project rule and additional screens that cover states like the edit and delete modals and error states. The displayed designs were the final product of numerous iterations.
Example User Flow
Additional Screens and Edge Cases

Impact
This feature has had a positive impact on our customer experience, our reviewer experience, and our company revenue. Many of our users have opted to enable Auto Review knowing that they can stipulate certain exceptions within the advanced settings tab. A higher volume of targeted pull requests is being sent to our network for review, and our customers save time by no longer having to manually sift through their pull request queues to request reviews. In addition, reviewers save time by providing feedback only on pull requests that require their attention.
Customer Feature: Dashboard & Metrics
Overview:
While our main product offering is code review as a service, performed by our network of expert engineers, we wanted to expand our product with additional insights and analytics around our customers’ internal code review processes. This would allow our customers to improve their internal code review culture while also driving more users onto our platform.
The Problem:
Because our product integrates with customers’ existing cloud repository platforms (GitHub, Bitbucket, and GitLab), users aren’t incentivized to engage with our platform on a regular basis. In addition, higher-level stakeholders need more information about their development team’s performance, as well as concrete stats about the value that PullRequest adds to their code review process.
The Goal:
Create a collection of visual, metric-based views where users can gain insight into their internal code review performance, compare themselves to industry benchmarks, understand the value PullRequest adds, and view code review activity broken down by individual users and projects.
Responsibilities:
User research, UX Design, Visual Design, QA Analysis
The Process:
Research & Ideation - Information Architecture - Designs - Testing & Iterations

Research & Ideation
Much of the research process involved working closely with our backend engineers to determine what user data we could gather and display. I also gathered feedback from our users about what information they would find most useful.
Key Research Findings:
Our users want a way to compare their internal code review processes against industry standards.
Our users want to better understand the value that PullRequest is adding to their code review process.
Development team managers and higher-level stakeholders would like a way to view their development team’s performance broken down by individual users.
Our users want to understand how they can improve their code review process.

Defining the Scope
After conducting research, I worked with our engineers to organize the data through information architecture. We separated our users’ needs into distinct feature categories, each addressing a specific need, and decided on the following pages (a rough sketch of the data behind these pages follows the list):
Organization Dashboard: This page will function as a landing page in our customer app, where users can view a high-level overview of metrics, benchmarks, PullRequest contributions, language breakdowns, and user performance. It will also link to our more detailed metrics pages.
Advanced Metrics: This page will display line and bar graphs for key author and reviewer performance metrics over the past 30 days. It will also show tables that compare user and project data, along with percentage changes since the last period.
Organization Benchmarks: This page will display code review activity metrics compared against teams of similar size. Users will be able to see how they stack up against industry standards, learn how they can improve, and track their own organization’s progress.
User Metrics: These pages will display quadrant scatter plots that let customers view their teams’ performance in a more multi-faceted way than the Advanced Metrics pages alone. Users will also be able to filter by individual user to see a breakdown of stats around author and review volume.
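As a rough illustration of the data behind these pages, here’s a sketch of a possible metric shape along with the period-over-period percentage change shown in the Advanced Metrics tables. The interfaces and the percentChange helper are assumptions made for this sketch, not our actual API.

```typescript
// Illustrative shapes for a 30-day metric series and its period summary.
interface MetricSeries {
  name: string;                              // e.g. "Time to first review"
  unit: "hours" | "count" | "lines";
  points: { date: string; value: number }[]; // one point per day, last 30 days
}

interface MetricSummary {
  current: number;       // value for this 30-day period
  previous: number;      // value for the prior 30-day period
  percentChange: number; // signed change since the last period
}

// Percentage change since the last period, as displayed in the tables.
function percentChange(current: number, previous: number): number {
  if (previous === 0) return current === 0 ? 0 : 100;
  return ((current - previous) / previous) * 100;
}

// Example: average review turnaround dropped from 18 hours to 12 hours.
const turnaroundSummary: MetricSummary = {
  current: 12,
  previous: 18,
  percentChange: percentChange(12, 18), // about -33.3%
};
```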

Designs
Organization Dashboard & Advanced Metrics

Organization Dashboard

Advanced Metrics (average)

Advanced Metrics (total)
Benchmarks & User Metrics

Impact
Our dashboard and metrics features have provided immense value to our customers. We’ve received a lot of positive feedback from one of our largest customers, Trilogy, who especially appreciate the ability to track their individual users’ stats and better understand their internal performance. The benchmarks page has given our users insight into their industry standing while also equipping them with the knowledge to improve their processes. The feature is relatively new, and I’m looking forward to tracking the continued progress and value these metrics bring our customers.