Academic Program Assessment Annotated Bibliography

Compiled by Kirk St. Amant, Peter England, James Johnston, Vanessa Killgore, Natalie Richardson, and Daniel Siepert.

Note: This annotated bibliography is organized into various thematic sections and sub-sections associated with program assessment. It is not meant to be a comprehensive document on the overall subject of program assessment. Rather, view it as a starting point for examining this topic. You are invited to contribute new annotations to this bibliography.

Benefits and Purpose

Allen, J. (2004). The impact of student learning outcomes assessment on technical and professional communication programs. Technical Communication Quarterly, 13(1), 93–108.

Abstract: Describes the processes of program assessment based on pedagogical goals. Qualities of good assessment. Clarifying programmatic context and using rubrics and matrices to evaluate student writing. Opportunities and choices that will make the technical communication faculty’s experience more meaningful and manageable. How larger academic technical communication programs can benefit from such work and how assessment helps programs meet professional expectations.

De Valenzuela, J. S., Copeland, S. R., & Blalock, G. A. (2005). Unfulfilled expectations: Faculty participation and voice in a university program evaluation. Teachers College Record, 107(10), 2227–2247.

Abstract: A qualitative case study of “faculty perceptions of the purposes, cost, and benefits of program evaluation” at a large public research university. Importance and benefits of stakeholder (teacher/faculty) participation and voice in assessment. Themes from faculty interviews: prior assumptions, contested purposes, and outcomes of unfulfilled expectations. Faculty assumed assessment would be a vehicle for their voice toward positive change but were disappointed to find it a mere formality.

Hayhoe, G. F. (2007). Why should program assessment matter to practitioners? Technical Communication, 54(4), 407–408.

Abstract: Importance of assessment and accreditation in producing better teachers and curricula and in training effective graduates for the workforce. Example overview of engineering accreditation programs.

Herrington, T. K. (2003). Intertwining structures of assessment and support: Assessing programs—Advancing the profession. In proceedings from CPTSC ’03: 30th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: Lessons learned from the author’s experience as an external assessor for San Francisco State University’s technical communication program.

Hovde, M. R. (2000). Assessing existing engineering communication programs: Lessons learned from a pilot study. In proceedings from CPTSC ’00: 27th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: Lessons learned from the author’s experience spearheading the pilot year of assessment of students’ engineering communication skills.

Munger, R. H. (2000). Untangling a jigsaw puzzle: The place for assessment in program development. In proceedings from CPTSC ’00: 27th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: Goal of CPTSC’s self-study and program review is to develop stronger programs, not to rank existing programs.

St. Amant, K., & Nahrwold, C. (2007). Acknowledging complexity: Rethinking program review and assessment in technical communication. Technical Communication, 54(4), 409–411.

Abstract: “Considers the nature of academic program review and assessment in technical communication. Examines the themes represented in the articles published in [Technical Communication, volume 54, issue 4].”

Criteria

Allen, J. (1993). The role(s) of assessment in technical communication: A review of the literature. Technical Communication Quarterly, 2(4), 365–388.

Abstract: Outlines what issues, assumptions, and ensuing questions should be addressed in a thorough program assessment.

Eaton, J. S. (2008, July/August). Attending to student learning. Change, 40(4), 22–27.

Abstract: Award criteria and process of the Council for Higher Education Accreditation (CHEA) Award for Institutional Progress in Student Learning Outcomes. Its focus on student achievement and accountability. Characteristics of outstanding progress in documenting learning: strong faculty leadership, attention to general education, preference for institutionally based strategies and instruments, focus on departments and schools, and a valuable role for accreditation.

Rainey, K. T., Turner, R. K., & Dayton, D. (2005). Do curricula correspond to managerial expectations? Core competencies for technical communicators. Technical Communication, 52(2), 323–352.

Abstract: Survey and interviews with 67 technical managers reveal sought-after competencies and compare them to those stressed by the 10 largest undergraduate technical communication programs. Most important competencies: collaboration with subject matter experts and coworkers, ability to analyze user needs, ability to assess and learn technologies, initiative, and self-evaluation. Recommendations for programs, emerging trends, and future roles of technical communicators.

Models and Methods

Portfolios

Coppola, N. W. (1999). Setting the discourse community: Tasks and assessment for the new technical communication service course. Technical Communication Quarterly, 8(3), 249–267.

Abstract: Argues for a social perspective of technical communication service courses based on several premises, including the need for accountability and the value of portfolio assessment. Based on a case study that demonstrates reliability, stability, and validity in assessment, tasks, and instructor community. Stresses the effectiveness of portfolios as assessment tools.

Scott, C., & Plumb, C. (1999). Using portfolios to evaluate service courses as part of an engineering writing program. Technical Communication Quarterly, 8(3), 337–350.

Abstract: The University of Washington’s Portfolio Evaluation Project (PEP) is used to “inform and reform curriculum.” The inadequacy of multiple-choice testing and timed writing samples as evaluation tools. PEP design and methods. Results: performance-based outcomes, assessment criteria, and curriculum and instruction change.

Thomas, S., & McShane, B. J. (2007). Skills and literacies for the 21st century: Assessing an undergraduate professional and technical writing program. Technical Communication, 54(4), 412–423.

Abstract: “Describes an assessment process for professional and technical writing at Weber State University. Includes a review of the literature, outcomes for each course, and a rubric to evaluate student portfolios in the capstone course.”

Williams, J. M. (2001). The engineering communication portfolio: Writing, reflection, and technical communication assessment. In proceedings from IPCC ’01: IEEE International Professional Communication Conference (pp. 341–347). Available from http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=971583

Abstract: Portfolio review is an effective and advantageous method for gathering data on students’ communication abilities. Defining good communication. Difference between individual assessment and program assessment and how portfolios provide information for both. Principles of portfolio administration: design/format, setup, objectives, and evaluation rubrics. Explaining objectives to students. Making the process efficient.

Surveys

Beidler, J. (2002). Assessment: An alumni survey. In proceedings from FIE ’02: 32nd Annual Frontiers in Education (Vol. 2, pp. F1B/22). Retrieved from http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=1158120&isnumber=24586

Abstract: Assessment input comes from graduating seniors, alumni, and hiring organizations. An overview of a Web-based alumni survey composed of four parts: education analysis, programming language support, software concepts, and personal data.

Jobst, J. W. (1997). College curriculum and the assessment of recent graduates. In proceedings from STC ’97: Annual Conference of the Society for Technical Communication. Retrieved from http://www.stc.org/confproceed/1997/PDFs/0032.PDF

Abstract: Steps of assessment to determine students’ preparedness for industry. Expected learning outcomes at Michigan Technological University. Evaluation conducted through a graduate survey and focus groups to learn what skills are used and where they were learned.

McGourty, J., Besterfield-Sacre, M., Shuman, L. J., & Wolfe, H. (1999). Improving academic programs by capitalizing on alumni’s perceptions and experiences. In proceedings from FIE ’99: 29th Annual Frontiers in Education (Vol. 3, pp. 13a5/9–13a5/15). Retrieved from http://succeednow.org/papers/fie99/fie99-016.pdf

Abstract: Alumni surveys at Columbia University and the University of Pittsburgh. How surveys can be developed, the cost involved, information that can be obtained, and how they can be used as part of the ABET Engineering Criteria 2000 process.

Reviews

Liang, Z. (2003). Using co-op reviews as an assessment tool. In proceedings from FIE ’03: 33rd Annual Frontiers in Education (Vol. 1, pp. T3B/31). Available from http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1263322

Abstract: Discusses the co-op visit and review procedure, its benefits and methods, an example result, and a planned improvement of the method.

Rehling, L. (2003). Thank you, thank you! Or: How external reviewers help out. In proceedings from CPTSC ’03: 30th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: “Conversations about assessment for technical communication programs often focus on evaluating features internally, through means such as course evaluations and portfolio reviews.”

Sides, C. H. (2007). First-person perspective: An analysis of informal and formal external program review strategies. Technical Communication, 54(4), 440–446.

Abstract: “Describes how ethics can be combined with use-inspired research to provide a foundation for external program reviews. Considers setting goals, selecting and preparing reviewers, conducting the reviews, and producing deliverables.”

Other Models

Anderson, P. V. (1995). Evaluating academic technical communication programs: New stakeholders, diverse goals. Technical Communication, 42(4), 628–633.

Abstract: Describes an approach to evaluation especially suited to technical communication programs. Discusses three “substantial challenges” to technical communication assessment.

Battle, M. V. (1993). Evaluation of training programs in technical communication. In proceedings from STC ’93: Annual Conference of the Society for Technical Communication. Retrieved from http://www.stc.org/confproceed/1993/PDFs/PG267270.PDF

Abstract: The efficient production needs of executives and administrators. The CIPP model of training program review, with examples. The benefits of professional evaluators and their ability to “lift the level of communication skills, the morale of the students and faculty, and the organization’s products.”

Carnegie, T. A. M. (2007). Integrating context into assessing US technical communication programs. Technical Communication, 54(4), 447–458.

Abstract: “Reviews the primary purposes for program reviews. Proposes the creation of a contextual program review model.”

Coppola, N. W., & Elliot, N. (2003). A behavioral framework for assessing graduate technical communication programs. In proceedings from CPTSC ’03: 30th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: Behavioral science emphasizes association, reliability, and validity. Proposed model based on five independent variables that “may be associated with effective programs in technical and scientific communication.”

Coppola, N. W., & Elliot, N. (2007). A technology transfer model for program assessment in technical communication. Technical Communication, 54(4), 459–474.

Abstract: “Offers a program assessment framework, centered on student performance, that has proven effective in establishing and assessing core competencies. Proposes a technology transfer model for the diffusion of program assessment knowledge.”

O’Rourke, N. (2000). The thorny issue of program assessment: One model for one program. In proceedings from CPTSC ’00: 27th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: Assessment, though controversial, is vital. Need for plans and data gathering. Need for formal documentation. ABET as an example.

Shor, M. H., & Robson, R. (2000). A student-centered feedback control model of the educational process. In proceedings from FIE ’00: 30th Annual Frontiers in Education (Vol. 2, pp. S1A/14–S1A/19). Retrieved from http://web.engr.oregonstate.edu/~shor/Shor-Robson.PDF

Abstract: Compares two models of the continuous improvement process: program-centered statistics and student-centered continuous feedback. The latter is more difficult to implement but more likely to achieve desired learning outcomes. Implications for classroom and program practices. Views the student, not the program, as the process. Measures performance as the student progresses through the program. Also notes pitfalls of the student-centered model.

Technology and Assessment

Baskin, P. (2008, April 18). Electronic portfolios may answer calls for more accountability. Chronicle of Higher Education, 54(32), A30–A31.

Abstract: The advantages of using electronic compilations of student work to directly score specific skills and to evaluate the teaching program. Rose-Hulman as an early adopter of e-portfolios, federal pressure, practical uses, and the use of e-portfolios in other colleges.

DePiero, F. (2001). NetExam: A Web-based assessment tool for ABET2000. In proceedings from FIE ’01: 31st Annual Frontiers in Education (Vol. 2, pp. F3A/13). Retrieved from http://fie.engrng.pitt.edu/fie2001/papers/1218.pdf

Abstract: Advantages of NetExam over scantron testing: statistics available online and easy review/comment capabilities. Exams generated on demand from a dynamic question database, which could include student-posed questions. Beta testing began in 2001.

Reece, G. A. (2002). Integrating technology into program assessment implementation and course design: “How can relational, problem-solving models help us reach our goals?” In proceedings from IPCC ’02: IEEE International Professional Communications Conference (pp. 166–184). Available from http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1049101

Abstract: Strategies for integrating technology into course design and program assessment. Background of technology-based learning and teaching. Role of problem-based learning. Demonstration of the new electronic relational system for tracking six assessment components: (1) assessment plan; (2) assessment activities; (3) course goals, contributions, and professional outcomes; (4) scoring rubrics; (5) process charts; and (6) survey information.

Sears, M., Campbell, K., & Whiteclaw, C. (2002). Faculty reward and promotion in distributed learning environments—Pedagogy in implementation. In proceedings from ICCE ’02: International Conference on Computers in Education (Vol. 2, pp. 1354–1355). Available from http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1186247

Abstract: Need for established criteria to recognize technology-based instruction. Use of technology increases demands on instructors. Inherent difficulty and lack of consensus in defining good instruction. Peer Review of Instructional Technology Initiatives (PRITI) provides a process and model for faculty reward and promotion in distance learning environments.

ABET Engineering Accreditation

Davis, M. T., Olsen, L., & Haselkorn, M. P. (1998). Proposal to support ABET accreditation for technical communication programs. Mercer University.

Abstract: “The Ad Hoc Committee on Accreditation recommends that the IEEE Professional Communication Society act as the sponsoring cognizant technical society to present technical communication program criteria to the Related Accreditation Commission (RAC) of the Accreditation Board for Engineering and Technology (ABET). This report contains the background documentation for this recommendation.”

Haselkorn, M., Davis, M. T., Goodman, M. B., & Nolen, B. E. (1998). Seeking ABET accreditation for technical communication programs. In proceedings from IPCC ’98: IEEE International Professional Communication Conference (Vol. 2, pp. 195–196). Available from http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=722097

Abstract: Progress on issues of accreditation under ABET. Discusses accreditation standards, description of criteria, and how sample programs could meet the criteria. Benefits of accreditation for programs, graduates, industries, and the profession.

Williams, J. M. (2001). Transformations in technical communication pedagogy: Engineering, writing, and the ABET engineering criteria 2000. Technical Communication Quarterly, 10(2), 149–167.

Abstract: Evaluates the immediate and long-term effects of the Accreditation Board for Engineering and Technology (ABET) Engineering Criteria (EC) 2000. How ABET affects technical communication and how technical communication faculty are responding. More and better-integrated, cross-disciplinary communication courses for engineering students. Difficulty identifying effective assessment methods has resulted in experimentation with portfolios and cross-department grading. Curriculum reforms that stress the connection between education and work.

Williams, J. M. (2002). Technical communication, engineering, and ABET’s engineering criteria 2000: What lies ahead? Technical Communication, 49(1), 89–95.

Abstract: The increasing need for engineering communication and the Accreditation Board for Engineering and Technology (ABET) Engineering Criteria (EC) 2000. Considerations for unified objective development and mapping. Technical communication’s role in addressing this need.

Integration of Academia and Industry

Dakich, M. (1988). Improving communication training for engineers. In proceedings from IPCC ’88: On the Edge: A Pacific Rim Conference on Professional Technical Communication (pp. 273–276). Available from http://ieeexplore.ieee.org/iel2/766/908/00024049.pdf?tp=&arnumber=24049&isnumber=908

Abstract: Recaps assessment of communication training needs of engineers. Suggested methods of improvement include “analysis of current offerings, coordination of in-house training with colleges, an extension of college services to the industrial site, attendance of professional conferences, the establishment of a forum between colleges and the community, and internship programs in technical communications.”

Feinberg, S. G. (2000). Should academic programs in technical communication try to strengthen the bond between academia and industry? In proceedings from CPTSC ’00: 27th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: Some issues present in both academia and industry: visualization of data, usability testing, the design of instructional material for the Web, and research. Problems and questions of industry collaboration: who states the problem, who manages the project, what resources are available, and who owns the results?

Kim, L., & Tolley, C. (2004). Fitting academic programs to workplace marketability: Career paths of five technical communicators. Technical Communication, 51(3), 376–386.

Abstract: Interviews with five graduates of the University of Memphis master’s in technical communication program. Interviewees’ career paths and skills used on a daily basis. Comparison with current issues facing technical communication programs, such as a rapidly changing field and whether or not to teach technological competencies.

Krestas, S. A. (1995). Future directions for continuing education in technical communication. Technical Communication, 42(4), 642–645.

Abstract: Integrating academic institutions with practical experience and industry professionals is key to the success of technical communication programs.

Program Development and Planning

Allen, J. (2000). Compact planning and program development: A new planning model for growing technical communication programs. In proceedings from CPTSC ’00: 27th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: Compact planning—a narrowly focused, resource-driven planning model—helps programs identify and reach short-term goals. The value of short-term goals given the rapidly changing field and technology. Programs need to “remain nimble, competitive, and distinctive.” Compact planning’s inclusionary, grassroots process benefits both the program and department levels.

Moore, M. R. (2000). Participatory design and technical communication: Challenges and opportunities in programmatic assessment and evaluation. In proceedings from CPTSC ’00: 27th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: “Technical Communication pedagogies that are informed by theories of Participatory Design offer new challenges and opportunities for both the assessment of student work and group projects and in the evaluation of programmatic goals.”

Tillery, D. (2003). Re-creating a Ph.D.: From technical to professional writing. In proceedings of CPTSC ’03: 30th Annual Conference of the Council for Programs in Technical and Scientific Communication.

Abstract: Reshaping the curriculum to prepare students, foster Ph.D. professional development, capitalize on program strengths, balance theory and practice, maintain legitimacy, and provide opportunities for intradisciplinary research.

Zimmerman, D. E., & Long, M. (1993). Exploring the technical communicator’s roles: Implications for program design. Technical Communication Quarterly, 2(3), 301–318.

Abstract: Suggested guidelines and implications for program development in technical communication based on professional roles and demographic data of technical communicators.

Miscellaneous

Kunz, L. D. (1995). Learning and success in technical communication education. Technical Communication, 42(4), 573–575.

Abstract: The success of technical communication programs depends on the success of their graduates, which in turn depends on students’ commitment to lifelong learning.