Revision as of 01:25, 10 October 2018
Code and Design Review
(Code review sessions - conduct and expectations)
Abstract:
This page explains conduct of and expectations for code and design reviews.
Objectives:
Deliverables:
Your participation as a reviewer in the code review sessions is worth up to 8 marks per panel, for a maximum of 4 × 8 = 32 marks.
Contents
We call our review sessions "Code reviews" for simplicity; however, the purpose is not only to review code, but also the design, contents, and form of the submitted material.
Code reviews are done for course credit of the reviewers. The units will already have been marked by the instructor before the review, so critical reviews cannot affect that mark; there is no need to hold back on criticism to protect your peers. The best way to help your peers is to provide a high-quality review with creative, constructive feedback. A well-prepared, thoughtful, knowledgeable review will be immensely valuable for your peers. And since you, as the reviewer, will be evaluated for credit, I hope that you will put your heart into it.
To repeat: your classmates will not benefit from a "soft" review, since their grade has already been determined. A stringent review gives them the best opportunity to improve their code and earn marks in the resubmission. A high-quality review will also earn the most marks for yourself - and will give you the most satisfaction for a job well done.
Principles
- Units are not to be modified, until after the code review. Do not edit, upload, commit, push, merge or otherwise modify any material you have submitted for credit until after the review is done.
- We will use four weeks for reviews. We will schedule five to six reviews per class session, with four reviewers. This will give everyone approximately five minutes for their review contributions.
- Everyone will review four units.
- Typically everyone will participate in one review at every session, and never more than two.
Schedule
Date | Title | Author | Reviewer 1 | Reviewer 2 | Reviewer 3 | Reviewer 4 |
October 10 | pointszr | Cait Harrigan | Doga Ister | Judy Lee | Yoonsik Park | Xindi Zhang |
October 10 | ScoreVisualizer | Judy Lee | Han Zhang | Cait Harrigan | Yin Yin | Chantal Ho |
October 10 | align.git | Han Zhang | Audrina Zhou | Fan Shen | Liwen Zhuang | Justin Lee |
October 10 | flagVis | Yoonsik Park | Yiqiu Tang | Emily Ayala | Deus Bajaj | Yufei Yang |
October 17 | pcclust | Audrina Zhou | Yin Yin | Cait Harrigan | Chantal Ho | Deus Bajaj |
October 17 | VISMCL.git | Liwen Zhuang | Judy Lee | Xindi Zhang | Fan Shen | Justin Lee |
October 17 | vslyzr | Deus Bajaj | Liwen Zhuang | Yoonsik Park | Yiqiu Tang | Audrina Zhou |
October 17 | SigAct | Xindi Zhang | Emily Ayala | Yufei Yang | Doga Ister | Han Zhang |
October 24 | tmine | Justin Lee | Han Zhang | Chantal Ho | Yiqiu Tang | Xindi Zhang |
October 24 | nonexonmap.git | Fan Shen | Deus Bajaj | Yufei Yang | Judy Lee | Doga Ister |
October 24 | genomeRegionsInfo | Yiqiu Tang | Cait Harrigan | Emily Ayala | Yoonsik Park | Yin Yin |
October 24 | KaKsMap | Emily Ayala | Audrina Zhou | Justin Lee | Fan Shen | Liwen Zhuang |
October 31 | HiCViz | Chantal Ho | Xindi Zhang | Yin Yin | Han Zhang | Justin Lee |
October 31 | BreakViz | Yin Yin | Audrina Zhou | Deus Bajaj | Yiqiu Tang | Fan Shen |
October 31 | Rview | Doga Ister | Liwen Zhuang | Yufei Yang | Judy Lee | Chantal Ho |
October 31 | MSPTM | Yufei Yang | Emily Ayala | Yoonsik Park | Cait Harrigan | Doga Ister |
Preparation
- The entire class is expected to have familiarized themselves with all submitted packages before the first review, to establish context.
- The entire class is expected to have installed and assessed the packages that are scheduled for review each review session, so that everyone can follow the discussion and contribute.
- The designated reviewers of a package have worked through the material in detail, have made themselves knowledgeable about the context and background, and have prepared their review contributions (see below). I expect that reviewers will come to class very well prepared, and that they consider the unit with reference to the expectations set out in the evaluation rubrics for software design, code and documentation.
During the review
Reviews proceed in three rounds: first, the lead reviewers ask their most important prepared questions in turn; second, the reviewers lead a more general round of discussion; finally, the discussion is opened to the entire class. Each review will take approximately twenty minutes.
- Code will not be presented by the author (unfortunately we don't have enough time), but the reviewers may ask some initial questions for clarification. Other than that, familiarity with the unit is assumed.
- Reviewers will comment on issues, focusing on the importance, visual appeal, and utility of the package. Ideally, reviewers will make specific suggestions for improvement, but it is better to point out a weakness, even if you don't quite know how to address it, than to remain silent. Once it is pointed out, others may have useful ideas. Of course, if you note particular strengths of the unit, that is also welcome.
- Issues for discussion could include:
- Suggestions to make the objectives of the tool more clear.
- Improvements to integrating the unit with existing packages (but without introducing unnecessary dependencies).
- Constructive critique of software design decisions.
- Improvements to examples to better illustrate the concepts.
- Addressing any unstated assumptions.
- Identifying significant dependencies that could become obstacles to refactoring.
- Flagging where the material lacks rigour or is factually incorrect.
- Improvements to form and layout.
- Identifying code that does not conform to coding style.
- Identifying code that exemplifies poor practice ("anti-patterns", "design smells", "code smells").
- Improvements to comments.
- Improvements to visuals.
- Flagging where the sample code might not be robust against faulty input.
- Flagging where the sample code might not be safe against overwriting user data.
- Any other issues ...
- During the review, reviewers take notes of responses and comments.
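Two of the issues above - robustness against faulty input and safety against overwriting user data - can be illustrated with a short sketch. It is written in Python here purely for illustration (the function and file names are hypothetical); the same principles apply regardless of the language a unit is written in:

```python
import os

def scale_values(values, factor=1.0):
    """Scale a list of numbers, validating input instead of failing obscurely."""
    # Robustness: reject faulty input with an informative error rather than
    # letting silently wrong results propagate downstream.
    if not all(isinstance(v, (int, float)) for v in values):
        raise TypeError("scale_values() expects numeric input only")
    return [v * factor for v in values]

def save_results(results, path, overwrite=False):
    """Write results to 'path', refusing to clobber existing user data."""
    # Safety: never overwrite a user's file unless explicitly asked to.
    if os.path.exists(path) and not overwrite:
        raise FileExistsError(path + " exists; pass overwrite=True to replace it")
    with open(path, "w") as fh:
        fh.write("\n".join(str(r) for r in results))

print(scale_values([1, 2, 3], factor=2.0))  # [2.0, 4.0, 6.0]
```

As a reviewer, look for exactly these patterns: does the sample code check its inputs, and does it ask before destroying anything the user already has?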
Overall, be mindful that code review is a sensitive social issue, and that the primary objective is not to point out errors, but to improve the entire "team".
After the review
- After the review, on the same day, the reviewers summarize their issues and proposals on the "Talk" page of the package synopsis on the Student Wiki (briefly, in point form);
- Once all suggestions are in, the unit author begins revisions.
- It is not mandatory that the revisions follow the reviewers' suggestions. Authors need to consider comments carefully, but apply their own best judgement. In the end, the reviewers are responsible for their reviews, but the author is responsible for their package.
A final note
I hope that in your career you will find yourself in a workplace where peer review is a regular part of your activities. It can contribute tremendously to better outcomes, more transparent and more meaningful work, and more cohesive teams. When that time comes, your skills as a reviewer will be evaluated implicitly, even if neither you, your supervisor, nor your project lead realizes it. You will be prepared.
Notes
Further reading, links and resources
If in doubt, ask! If anything about this learning unit is not clear to you, do not proceed blindly but ask for clarification. Post your question on the course mailing list: others are likely to have similar problems. Or send an email to your instructor.
About ...
Author:
- Boris Steipe <boris.steipe@utoronto.ca>
Created:
- 2017-10-13
Modified:
- 2018-09-12
Version:
- 1.1
Version history:
- 1.1 2018 updates
- 1.0 New unit
This copyrighted material is licensed under a Creative Commons Attribution 4.0 International License. Follow the link to learn more.