Top 10 Best Practices for Peer Reviews

Here is a nice list of best practices for peer reviews from SmartBear. These parallel the recommendations I usually give, but it is handy to have the longer version readily available too (see link below).

1. Review fewer than 200-400 lines of code at a time.
2. Aim for an inspection rate of less than 300-500 LOC/hr (but see my comment below).
3. Take enough time for a proper, slow review, but not more than 60-90 minutes.
4. Authors should annotate source code before the review begins.
5. Establish quantifiable goals for code review and capture metrics so you can improve your process (a small sketch of what that might look like follows this list).
6. Checklists substantially improve results for both authors and reviewers.
7. Verify that defects are actually fixed.
8. Managers must foster a good code review culture in which finding defects is viewed positively.
9. Beware the "Big Brother" effect (don't use metrics to punish people).
10. The Ego Effect: do at least some code review, even if you don't have time to review it all.
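
Item 5 is easier to act on with a concrete picture of what "capture metrics" can mean. Here is a minimal Python sketch of two numbers worth tracking per review, inspection rate and defect density; the function and parameter names are my own illustration, not SmartBear's terminology or tooling.

```python
# Illustrative only: the kind of numbers practice #5 suggests capturing.
# Names and values here are my own assumptions, not SmartBear's tool.

def review_metrics(loc_reviewed, minutes_spent, defects_found):
    """Return inspection rate (LOC/hr) and defect density (defects/KLOC)."""
    hours = minutes_spent / 60.0
    inspection_rate = loc_reviewed / hours                    # LOC per hour
    defect_density = defects_found / (loc_reviewed / 1000.0)  # defects per KLOC
    return inspection_rate, defect_density

rate, density = review_metrics(loc_reviewed=250, minutes_spent=90, defects_found=4)
print(f"Inspection rate: {rate:.0f} LOC/hr, defect density: {density:.1f} defects/KLOC")
```

Tracked over time, numbers like these let you see whether reviews are rushed (rate creeping up) and whether they are still finding problems (density dropping toward zero).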

And now my comments: The data I've seen shows 300-500 LOC/hr is too high by a factor of about 2. I recommend 100-200 lines of code per hour for 60-120 minutes. It may be that SmartBear's tool lets you go faster, but I believe that comes at a cost that exceeds the time saved.
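
To make that arithmetic concrete: at 100-200 LOC/hr, a 60-120 minute session covers roughly 100-400 lines, which lines up with practice #1. Here is a purely illustrative Python sketch of that planning math; the function name and default values are my own assumptions, not anything from the white paper.

```python
import math

# Hypothetical planning helper based on the rates I recommend above
# (100-200 LOC/hr, sessions capped at roughly 60-120 minutes).

def plan_review(loc, rate_loc_per_hr=150, max_session_min=90):
    """Estimate total review time and how many sessions to split it into."""
    total_minutes = loc / rate_loc_per_hr * 60
    sessions = max(1, math.ceil(total_minutes / max_session_min))
    return total_minutes, sessions

minutes, sessions = plan_review(loc=600)
print(f"~{minutes:.0f} minutes total, split across {sessions} session(s)")
```

The point is simply that a large changeset should be broken into several sessions rather than reviewed faster.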

I deleted their best practice #11, which says that lightweight reviews are great stuff, because I don't entirely buy it.  Everything I've seen shows that lightweight reviews (which they advocate) are better than no reviews, and for that reason perhaps they make a good first step. But if you skip the in-person review meeting you're losing out on a lot of potential benefit.  Your mileage may vary.

You can get the full white paper here: http://support.smartbear.com/resources/cc/11_Best_Practices_for_Peer_Code_Review.pdf (They are not compensating me for posting this, and I presume they don't mind the free publicity.)