The Peer Review of Teaching


Periodic peer review of teaching is required of all faculty by UO Senate legislation passed in 1996 and by the 2015-2018 Collective Bargaining Agreement between the University and United Academics. How to carry out that peer review was left largely to individual departments, with the result that the procedures adopted vary widely across campus. TEP recommends that departments adopt procedures and rubrics that guide assessment of how the faculty member employs practices shown to improve student learning [1] [2] [3] [4] [5]. This would allow for a more consistent and formative review process, benefiting students, faculty, and the university as a whole.

To this end, in Spring 2016 TEP partnered with the Office of the Provost and Academic Affairs (OPAA) to offer principles and tools that support departments in developing and implementing the peer review process in ways that enhance its value to individual faculty members and to the UO teaching and learning community. The guiding principles may be found on the OPAA website, and TEP's specific procedural recommendations and rubrics appear below, along with examples of procedures and documents generously provided by a sampling of departments.

Peer Review of Teaching Example Documents

The examples below include TEP-recommended documents alongside other UO examples from the Department of Human Physiology (HPHY), the American English Institute (AEI), and the Arts and Administration Program (AAd).

Review Procedure

- HPHY Department Procedures (Word doc / PDF file)
- AEI Summative Observation Process (Word doc / PDF file)
- AAd Peer Review Procedure (Word doc / PDF file)

Observation Instrument

- Classroom Observation Protocol for Undergraduate STEM (Excel file; example COPUS file with dummy data)
- AEI Summative Observation Standards (Word doc / PDF file)
- AEI Formative Peer Observation Form (Word doc / PDF file)
- AAd Observation Form (Word doc / PDF file)

Goal-Setting, Self-Appraisal, and Summary Documents

- AEI Goal-Setting Form (Word doc / PDF file)
- AAd Self-Appraisal Form (Word doc / PDF file)
- HPHY Department Template (Word doc / PDF file)
- AAd Summary Form (Word doc / PDF file)

Workload Expectation for Review

- HPHY: About 5 hours
- AEI: 5 – 6 hours
- AAd: About 5 hours

For more information, contact the individual department or program.

TEP includes its own documents above, alongside other models generously provided by UO's Department of Human Physiology (CAS), Arts and Administration Program (AAA), and the American English Institute (CAS). These models were developed through thoughtful, collaborative processes and are shared in the hope of being helpful to colleagues. They range from highly specific and data-driven (Human Physiology) to more open-ended (Arts and Administration).

The strengths of data-driven tools are that they draw the reviewer's attention to specific, itemized "actions" of teachers and learners: they generate information about the class separate from a judgment of quality (though in these examples, quality is implied inasmuch as the actions on the list are linked to research on student learning). And, ideally, they give the reviewer a fresh, more objective way of looking at the class. The disadvantages of data-driven tools, perhaps especially the Classroom Observation Protocol for Undergraduate STEM used by the Human Physiology department, are that they require training and may defer worthwhile impressions about quality.

Other, more open-ended tools give the reviewer more latitude to record impressions. These instruments use less specific observation prompts and instead encourage the development of a holistic narrative about the faculty member's performance, which may make them easier to adapt for classes that do not follow the standard meeting format (such as laboratory, studio, or field classes). But tools offering only loose guidance have disadvantages as well: they make it more likely that a review will reflect a reviewer's personal predispositions about what constitutes good teaching, leading to less consistency between reviewers and making comparisons between faculty difficult.

An annotated list of other available observation instruments, self-assessment tools, self-efficacy scales, and instruments for graduate student teacher development and self-efficacy can be found at the American Society for Biochemistry and Molecular Biology site. The instruments on the list have generally been extensively tested for reliability and validity.

N.B. These materials primarily focus on the review of face-to-face classes. To assess the quality of a fully online course, TEP offers these online considerations and directs faculty to the new Policy on Undergraduate Online and Hybrid Courses: Student Engagement.

Works Cited

[1] S. Freeman, S. L. Eddy, M. McDonough, M. K. Smith, N. Okoroafor, H. Jordt and M. P. Wenderoth, "Active learning increases student performance in science, engineering, and mathematics," Proceedings of the National Academy of Sciences of the United States of America, vol. 111, no. 23, pp. 8410–8415, 2014.

[2] S. A. Ambrose, M. W. Bridges, M. DiPietro, M. C. Lovett and M. K. Norman, How Learning Works: Seven Research-Based Principles for Smart Teaching, Hoboken, New Jersey: Jossey-Bass, 2010.

[3] J. E. Froyd, "White Paper on Promising Practices in Undergraduate STEM Education (Evidence on Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education Workshop 1)," 2008.

[4] J. D. Bransford, A. L. Brown and R. R. Cocking, Eds., How People Learn: Brain, Mind, Experience, and School, Washington, D.C.: National Academy Press, 2000.

[5] P. C. Brown, H. L. Roediger and M. A. McDaniel, Make It Stick: The Science of Successful Learning, Cambridge, Massachusetts: Belknap Press, 2014.
