Performance Reviews for PMs

Sean Horgan

The System

You should think of the performance review process as an important part of a continuous feedback cycle that is connected to your company's strategy and operations. Performance reviews should also be connected to your hiring process: you shouldn't assess people coming into the building with a different set of criteria than you use for people already in the building.

Principles

Feedback is a gift.
No surprises. PMs on your team should be getting continuous feedback on their work. If there is a rating system in place, set expectations early in each cycle. If there is a common rating that most people get when they meet expectations, tell people that unless you've explicitly discussed another rating, they will get that common rating.
Context is king. What stage product is your PM working on? Are you pre Product-Market-Fit (PMF), post PMF, or mid PMF?
Do’s:
Be descriptive, provide concrete examples to support your case.
Consider the entirety of your working relationship. Review work materials to get a glimpse of the employee's entire impact.
Consider both the “what” and the “how”, referring to specifics in the job role & level.
Offer constructive criticism to help improve the employee's work product.
Block time on your schedule. Dedicated time without distractions will allow more thorough feedback.
Include examples of when the employee exemplified company values.

Don’ts:
Provide commentary without supporting evidence.
Be vague, or ramble with no clear direction.
Rush. Peer feedback is used to support performance ratings, and calibrate employees across the org. By not spending enough time on your feedback, you could be selling your peer short.
Apply biases. Review the most common forms (see Avoiding Bias below) to check yourself.
Pass judgement on someone based on one occurrence or instance of a concern. Instead look for patterns of behavior or performance.
Relay stories instead of facts about what actually happened.

Giving Feedback

Instead of making blanket statements or giving personal opinions, which can be subjective, try to paint the bigger picture. Consider using the SBI format: Situation, Behavior, Impact.
Situation - Describe when and where the action or behavior was seen.
Behavior - Stick to the indisputable facts. What did you specifically see or hear?
Impact - What impact was felt or observed?
This model works for both positive and constructive feedback. Let’s see it in action:

[Image: example of SBI feedback in practice]

Getting feedback from peers


Feedback from non PMs

You need to provide some context on the role when you want to get feedback on the PMs on your team from people who don't have a deep understanding of the attributes or levels in product management. Below is a short email I put together and send out.
Template
Given the nature of the PM role, feedback from XFN peers is an important part of career development. To align your feedback with the expectations of a PM, I've put together some framing questions below. You can learn more about the role by level in [link to your PM leveling guide]. I will not directly attribute feedback unless you ask me to. Let me know if you have questions, and thanks for your time.
Feel free to share whatever you’d like but if you’d like some guidance you can use this template:
Highlights: where did they outperform this cycle?
Growth areas: where should they focus their efforts in the next cycle?
Functional rubric: Please rate each question using a scale of [Misses, Meets, Exceeds, Exemplary] along with specific examples where you can:
How well do they understand the market landscape, customers, users and our products?
How well do they translate this understanding into compelling strategy and product proposals?
How well do they translate strategy into goals, priorities, and plans that mobilize our teams?
How well do they communicate and collaborate with team members, other teams at the company, and external partners?
How well do they deliver impact on our highest priority objectives?

Shorter variation
Impact: Did they deliver impact on our highest priority objectives?
How they got there:
How well do they translate their understanding of the market landscape, customers, and users into a compelling long-term strategy?
How well do they deliver product solutions that realize the strategy and delight users?
How well do they collaborate & communicate with stakeholders, team members, other teams, and external partners?

Putting it all together

With feedback from you and peers in hand, I like to organize it all into 2 broad buckets: areas of strength & areas of growth. In each bucket, I map the feedback to PM attributes.

Delivering the review

If you are new to the process or new to the person you are reviewing, take the time to write a script for the actual review session. You should set aside specific time to dig into the review — avoid shoehorning the review into an existing 1:1 unless you know you have adequate time (usually 30 minutes).
If your performance review process includes a tool to collect & deliver feedback, you can release it before, during, or after the review session. I like to release it after the session.

Sample script
Thank the person for their work. Where appropriate, call out 1 or 2 highlights right at the top. Don't treat this like a 'shit sandwich'; everyone sees through that.
If there is a rating system, get right to it, as it's probably top of mind. I find that the calibration summary, i.e. why not lower & why not higher, succinctly frames the rating.
Work through the detailed feedback, engaging with the person throughout.
Get aligned on next steps in the meeting.
Schedule the next check-in.

Avoiding Bias

Attentional biases

Biases that cause decisions to be influenced by less relevant information.
Recency bias: Over-weighting the most recent information (e.g., projects at the end of the quarter)
Consider concrete data points from throughout the review period.
Halo/horns effect: Over-weighting a general impression (e.g., overall positive/negative feelings)
Consider concrete, behavioral examples from across the review period.
Availability bias: Over-weighting information that comes to mind easily (e.g., big, flashy projects)
Review archives (bugs, project plans) for more examples from across the review period.
Fundamental attribution error: Over-weighting personal explanations for performance; under-weighting situation-based explanations for performance (e.g., poor ability vs. lacked resources)
Consider context or situational factors that affected performance.
Stereotyping: Applying positive/negative stereotypes to employees' social groups (e.g., gender, race, age)
Focus on concrete accomplishments.
Anchoring bias: Over-weighting the first thought/rating
Consider significantly different perspectives from the first thought/rating.

Motivational biases

Biases that cause decisions to be influenced by desires and goals.
Central Tendency: Choosing a middle rating in order to play it safe and avoid risk
Provide justification, especially when scores hover around the middle
Agreement Bias/Spiral of Silence: Agreeing with others to avoid conflict
In calibration meetings, others may avoid conflict about a person’s rating via fight or flight. Take time to have the conversation and listen to their feedback.
Leniency Error: Rating an employee highly to avoid confrontation / make the employee feel good
Self-serving Bias: Inflating ratings of employee to make self look good
Similar-to-me Error: Rating employee highly based on similarities to you in order to self-enhance
