
Paper ID #8477

Technical Design Reviews in Engineering Capstone

Dr. Gene Dixon, East Carolina University

Gene Dixon is a tenured Associate Professor at East Carolina where he teaches aspiring engineers at the undergraduate level. Previously he has held positions of responsibility in industry with Union Carbide, Chicago Bridge & Iron, E.I. DuPont de Nemours, Westinghouse Electric, CBS, Viacom and Washington Group. Positions include engineer, program assessor, senior shift manager, TQM coach, and production reactor outage planner. He received a Ph.D. in Industrial and Systems Engineering and Engineering Management from The University of Alabama in Huntsville, a Master of Business Administration from Nova Southeastern University and a Bachelor of Science in Materials Engineering from Auburn University. He has authored several articles on the follower component of leadership and is active in research concerning energy, engineering education, and leadership processes. He has served as newsletter editor/secretary, program chair, division chair and awards chair (or equivalent) in both the Engineering Management and Engineering Economy Divisions of ASEE. He is a fellow of the American Society of Engineering Management and serves as Secretary of the Society and will be president in 2015. Dixon also serves on the Eugene L. Grant Award Committee for the Engineering Economy Division of ASEE. He has been active in leading capstone projects, capstone courses and industry-community relations for eight years.

© American Society for Engineering Education, 2014

Technical Design Reviews in Engineering Capstone

Abstract

Technical design reviews are used throughout industry to assess, question, improve and approve designs. The review process is a frank exploration of a design’s efficacy. This paper reports on the use of the industrial-style technical design review, or its equivalent, in the engineering capstone. The technical design review can be used as an informative lecture topic or applied to the capstone project. Based on survey data, this paper reports on how, when and to what extent technical design reviews are used by respondents. Sixty-seven respondents from academia provided input. The data reported in this paper represent a qualitative exploration of the use of technical design reviews (TDRs) directed at determining what characteristics are valued for TDRs in engineering capstone programs. The results are limited by the qualitative design of the research. A more structured quantitative approach would have placed greater demands on participants’ time and effort, which was beyond the scope of this initiative.

Introduction

Main1 has defined design as “…a creative process that generates new and unique solutions to ever-changing customer demands.” Engineering capstone design courses are recognized as “…a culminating experience” where students apply “…knowledge and abilities to practical engineering problems.”2 The capstone experience permits students to connect theory and practice in the final academic process of developing the professional skills of design and personal relationships through teamwork. Capstone texts each present variations of the design process, such as stage-gate and the systems engineering lifecycle; however, no consensus on what specifically constitutes engineering design was found3. These variations include general references to TDRs.

Duesing et al.4 have described the philosophy behind engineering capstone courses as preparing the engineering student for work in industry. That philosophy implies that capstone is seen as a transition from academia to engineering practice. Yet as the engineering students’ capstone experience is marketed, vetted5, and assessed, relatively little work has been reported concerning the development of the use of TDRs within capstone programs as a process or pedagogy. Lectures, handouts, guidebooks and textbooks have modest offerings on developing student abilities regarding the process of TDRs. A review of the research literature provides little pedagogy or methodology for developing knowledge, skills and abilities (KSAs) within students that are useful in preparing them for TDRs on campus and post-graduation.

Context

Practicing engineers and technical professionals have design experience that goes beyond technical design and includes6: design priorities, including proper design tools and methods; economic analysis; the importance of non-technical issues involving marketability, legal issues, codes and standards, product safety, environmental issues, etc.; and design reviews. Design reviews are recognized to have two objectives: 1) to identify deficiencies or problems with the presented design, and 2) to improve the design6. In industry, design reviews are used to avoid expensive change-orders by having sponsors, champions or owners verify that the design product meets constraints and objectives, and to improve the design7. The design review process is the integration of the owner’s perspective, the designer’s perspective, and project requirements8. The complexity of this tri-party process means that engineers will neither be supplied with, nor be able to develop, a comprehensive codification of absolutes that apply universally across all design reviews. However, this complexity indicates the need for frequent and continual review-related dialogue regarding the progress and efficacy of the design as it develops.

The complexity of the basis of design reviews, the tri-party integration, does not diminish the need for design reviews that ensure designs meet necessary requirements before being released to the next phase of development1. Work by Ferguson and Sanger9, Adams10, and Gharabagi and Ebel11 identifies commonly used industry phase reviews (a schematic sketch of the gate sequence follows the list):

• Conceptual Design Review (CDR): a review of design alternatives, selection, and specification of codes, standards and regulation compliance. This review typically focuses on the engineering design specifications with the objective of ensuring that the design meets all criteria, or that compromises are appropriate and required.
• Preliminary Design Review (PDR): a review to determine that materials and components are properly allocated and resourced.
• Final (Critical) Design Review (FDR): sometimes referred to as a detailed design review (DR), this technical review is used to detect and resolve design errors and omissions and may include a failure mode and effects analysis (FMEA).
• Release to Test/Build (RTB): a review that verifies that issues identified in the CDR have been appropriately resolved, test plans are developed, and safety/hazards analyses are completed. This review focuses on user and builder hazard minimization and may involve iteration to an earlier review phase.
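To make the gate sequence concrete, the sketch below models the four reviews as an ordered progression in which a review either releases the design to the next phase or iterates back to an earlier one (as the RTB description allows). This is a minimal, illustrative sketch only; the names Gate and next_gate are hypothetical and not drawn from the cited sources.

    from enum import Enum
    from typing import Optional

    class Gate(Enum):
        # The four industry phase reviews described above.
        CDR = "Conceptual Design Review"
        PDR = "Preliminary Design Review"
        FDR = "Final (Critical) Design Review"
        RTB = "Release to Test/Build"

    ORDER = [Gate.CDR, Gate.PDR, Gate.FDR, Gate.RTB]

    def next_gate(current: Gate, passed: bool, rework_to: Optional[Gate] = None) -> Optional[Gate]:
        """Advance to the next review on success; on failure, iterate back to an
        earlier phase (or repeat the current one). Returns None when the design
        clears RTB and is released to test/build."""
        i = ORDER.index(current)
        if passed:
            return ORDER[i + 1] if i + 1 < len(ORDER) else None
        return rework_to if rework_to is not None else current

    # Example: an RTB review uncovers an unresolved hazard, so the team
    # iterates back to the FDR phase, as the RTB description permits.
    print(next_gate(Gate.RTB, passed=False, rework_to=Gate.FDR))  # Gate.FDR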

Main1 makes the point that for an efficient design process, the most valuable reviews are the early ones. Design inadequacies that are unidentified, unchecked or unresolved until later phases often result in costly design modification. As the design develops, the cost of making and implementing changes increases significantly.

It is important for designers and design reviewers to recognize the importance of the TDR process. The design review is a criticism-centric process, i.e., a search for errors and omissions. For reviewer and designer to be supportive of this criticism-centric process, they must remain emotionally detached from the design in order to support a value-added discourse leading to robust design12. When conducted with the appropriate diligence and attitude, the design is improved. D’Astous13 states that design reviews are a dialogue that can improve design and therefore justify a negative evaluation of a design product. The dialogue required is built on trust, which requires management support and recognition of the importance of the work involved in the TDR process14.

Bond15 suggests that the potential for improvement necessitates that TDRs be conducted as soon as technically feasible solutions are identified and continue until the RTB phase is complete. For Bond, the premise is that ongoing design reviews result in additional and refined alternatives that involve a larger group of people, i.e., more involvement improves design. Bond also recognizes that judgment is often superior to analytical tools and cannot be replaced; experience is a recognized source of sound judgment.

The cost of design defects can be significant. Kemerer and Paulk7 report on research that found that a requirements defect can cost 100-200 times as much to repair late in development as it would have cost to resolve during CDRs or PDRs. Early design defect detection can avoid construction-identified change notices (6-23% of the original estimate), as well as reduce maintenance and operating costs throughout the life of the designed product. Also, liability insurers recognize the value of design reviews in reducing errors and omissions claims and litigation15. Kinnersley and Roelen16 claim that up to 60% of the root causes of accidents stem from design deficiencies. Taylor17 concludes that design reviews have been estimated to discover and remove between 80% and 95% of the errors made during the design process.
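As a back-of-the-envelope illustration of these multipliers, consider the arithmetic below. The base repair cost and project estimate are invented for the example; only the 100-200x and 6-23% figures come from the studies cited above.

    # Illustrative arithmetic only: the $1,000 base cost and $5M estimate are
    # assumptions; the multipliers are the ranges reported in the text above.
    base_fix_cost = 1_000                 # assumed cost to resolve a defect at CDR/PDR
    late_multiplier = (100, 200)          # Kemerer and Paulk: 100-200x when caught late
    print([base_fix_cost * m for m in late_multiplier])         # [100000, 200000]

    project_estimate = 5_000_000          # assumed original project estimate
    change_notice_share = (0.06, 0.23)    # construction-identified change notices: 6-23%
    print([project_estimate * s for s in change_notice_share])  # [300000.0, 1150000.0]

Under these assumptions, a defect that would have cost $1,000 to resolve at review costs $100,000-$200,000 to repair late, which is the economic case for early, frequent reviews.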

Capstone TDR Process

The literature on capstone TDRs describes variations on a generally applied industry approach using TDRs associated with design gates or phases as described above. Industry approaches are described in various standards (e.g., IEEE Std. 1028-1997) appropriate for the design application. The TDR process is a peer evaluation of a design as it is developed and/or before it is deployed for development, fabrication or production. Peer reviewers are drawn from independent pools in order to provide experienced, unbiased, and objective design oversight19.

Similar to spaced TDRs in industry, Wilson, Cambron, and McIntyre20 describe a capstone TDR process that requires reviews throughout the year where students are used as independent reviewers (see also Duesing et al.4). At my university, faculty provide independent reviews throughout the year, and at the end of each semester a panel of industry representatives performs a review of written design documents for technical adequacy (checksheets available on request). These spaced reviews approximate the CDR, PDR, and FDR described above. Archibald, Reuber, and Allison6 report on the use of three to five professionals in performing reviews. For capstone projects, the quality of the TDR is dependent on students’ ability to communicate design objectives and solutions. This parallels industry practice. A professional design presentation aids the efficiency and effectiveness of the design review process. A design presentation should include the appropriate analysis of prior art, customer objective(s), customer requirements, design economics, drawings, analytical results, engineering changes, test reports, and an open issues list4. Patent search results may also be included. As the design develops, the presentation should provide insights into design activities, design alternatives considered and selected, technical/economic trade-off analysis and justifications, and conclusions21.

Often the TDR process includes oral presentations. During oral presentations, design assumptions, analyses, alternatives and conclusions are challenged during question and answer (Q&A) portions of the TDR. Duesing et al.4 state that it is “…critical that engineers explain their concepts and designs to an engineering and management audience” as well as “…answer questions regarding their concepts and designs for these same audiences.” The Q&A portion of the TDR requires students to prepare for, and anticipate, questions. One of the critical learning opportunities of the Q&A process has been for students to learn the difference between defending their design and arguing for their design. This “delicate skill” requires students to hone their KSAs of communicating their work while being sensitive to reviewers’ insights for improving their work.

Methods

A simple questionnaire was developed and distributed to the capstone community. The survey was distributed electronically to two groups. Survey 1 was sent to a broad population of capstone-related constituents including ECU Engineering alumni, ECU Engineering Advisory Board members and recent National Capstone Conference attendees (N~600). Survey 2 was sent to the American Society for Engineering Education’s Design in Engineering Education Division through a listserv (N~800). Both surveys requested participants to forward the survey link to other capstone community members. Participants included capstone instructors/coordinators (n=60) and capstone sponsors as well as other industry representatives (n=4). The low industry participation does not support comparative analysis between the industry sponsors and representatives of academia. Table 1 shows the participant categories and response levels.

Table 1: Respondent Capstone Roles

Capstone Role                                       Survey 1        Survey 2        Total
Academic: instructor, adviser, coordinator, etc.    33 (87%)        27 (93%)        60
Industry: sponsor, supporter, SME adviser, etc.     4 (11%)         0 (0%)          4
Other*                                              1 (3%)          2 (7%)          3

(Survey 1: General Capstone Constituency; Survey 2: DEEDS Members)
*Survey 1: retired instructor. Survey 2: team advisor and sponsor; program director.

The survey was designed to be brief and general. Because of the desire for low impact on participants, few demographic questions were asked (see Appendix 1). The survey included a Likert scale question containing literature-identified characteristics and components of TDRs, in which respondents were asked to assign value on a 4-point scale from doesn’t matter (1) to must have (4). Additional space was provided for other preferred alternatives. Qualitative questions addressed problem statement precision. Examples of exemplary TDRs were requested, as well as reasons why the TDRs were considered exemplary.

Results

Participants were asked if the programs with which they were affiliated included a TDR process. Results for the two surveys are provided in Table 2. Approximately 90% of all programs represented in the survey included some TDR processes in some or all of their engineering or engineering technology capstones.

Participants were asked how frequently TDRs were administered during the capstone semester(s). Table 3 shows the results. Most capstone programs (42%) included multiple TDRs per semester. Respondents were given an option to explain variations in the frequency of capstone TDRs. Relevant responses are found in Table 4. In general, variations can be categorized as project/program specific and are related to the needs and objectives of the program design.

Table 2: Capstones containing TDR processes

Capstone TDR Inclusion                                      Survey 1    Survey 2    Total    %
All capstone programs include a technical design review     28          22          50       75
Some programs include them                                   5           5          10       15
No                                                           5           2           7       10

(Survey 1: General Capstone Constituency; Survey 2: DEEDS Members)

Table 3: Frequency of TDRs within Capstone.

Frequency                  Survey 1    Survey 2    Total    %
1 per project lifecycle     5           4           9       16
1 per semester              6           4          10       18
2 per semester             11           8          19       35
>2 per semester             7           4          11       21
Other, please explain       6           7          13       24

(Survey 1: General Capstone Constituency; Survey 2: DEEDS Members)

Participants were asked to identify essential characteristics or components of TDRs for capstone application. The low response of industry representatives prohibited comparative analysis. A list of recognized components/characteristics was supplied in the survey (Appendix 1). A Likert response was requested for each survey-provided characteristic/component. These characteristics/components were developed from various literature sources, industry examples of TDR checklists, and the author’s anecdotal experience from capstone and from a lengthy industry career. The Likert scale requested ratings on the order of: 1, doesn’t matter; 2, may be good to have; 3, nice to have; and 4, must have. For the provided TDR components, the responses indicated above-midpoint affinity for including the component as an essential part of the TDR. Components and sample means are shown in Table 5.
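The sketch below shows the straightforward computation behind Table 5: each component’s 1-4 Likert ratings are averaged, with the response count reported alongside. The ratings shown are fabricated placeholders, not the survey data.

    # Minimal sketch of the Table 5 computation: average each component's
    # 1-4 Likert ratings and report the response count. The ratings here are
    # made-up placeholders, not the actual survey responses.
    responses = {
        "assumptions adequately described and reasonable": [4, 4, 3, 4],
        "environmental issues appropriately addressed":    [3, 2, 3, 2],
    }

    for component, ratings in responses.items():
        mean = sum(ratings) / len(ratings)
        print(f"{component}: n={len(ratings)}, mean={mean:.2f}")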

Respondents were also provided space to include additional characteristics and/or components. For Survey 1, the General Capstone Constituency, additional items included: manufacturability, consideration of alternative materials and processes, aesthetics, ergonomics, safety of operator(s), quality assurance, testability, and reliability. For Survey 2, DEEDS Members, additional characteristics/components included: assessment against requirements, cost estimate, demonstration of the device performance, description of 5 alternatives, economic impact of project, business ROI, expected performance vs. spec, functional block diagram of the design, future work needed or desired, market/user assessment of design vs. competition or consumer choices, project plan assessment, project plan for completing build & test phase, remaining issues/barriers, safety, selection of material and manufacturing process described, technology roadmaps, technical risk assessment, tolerance issues considered, and understanding of physical requirements of device. While there is divergence between the two sample response sets, there is also evidence of programs that broaden the definition of the TDR process to include programmatic reviews. This is not meant to be a judgmental observation, as capstone programs define programmatic outcomes based on their pedagogical designs.

Table 4: Explanations of Variations in Capstone TDR frequency.

• It varies from 1 per project lifecycle to every month.
• Frequent, interactive progress reviews during the semester.
• Ad hoc, part of monthly orals.
• I run computer science capstone projects where the project mentors in industry constantly review the technical designs (req./design/code/test).
• As needed with advisor.
• This technical review is performed at the end of the semester, and included in the final design report.
• 2-semester course; fall has a preliminary design review conducted by sponsoring company or agency and a system level design review that is a public event (PPT presentation) often coupled with a private review with the sponsoring company/agency; spring semester includes 2 Qualification Review Board reviews where teams present to expert faculty panel whose job is to find flaws and provide advice for corrective action; final design review includes a public PPT briefing, poster and demonstration session--often accompanied by private review session at sponsor site.
• Mid-term review every 2 years.
• 2-term capstone, 1st term has two design reviews and 2nd term shifts to build and test.
• Tollgate based process with a total of 5 reviews associated with the tollgates.
• There is one per semester and it is during the sixth week of class so it is not comprehensive.
• Always once sometimes twice; once tends to be report at the end of the term NOT a real review.

Participants were provided an opportunity to share thoughts on the TDR process and its components. Specifically, input was sought on indicators of precision and quality associated with effective capstone TDRs, as well as examples of exemplary TDR processes. Analysis of these data will be the subject of a follow-on paper.

Limitations

As stated above, the survey was designed to be brief and general and to have only a low impact on participants in terms of inputs requested and time for responding. By design, therefore, few demographic questions were asked (see Appendix 1). Demographic questions, such as the number of institutions invited to and participating in the survey, were not addressed. This limits the analysis to only a coarse assessment of the samples collected. The lack of participation from industry also confounds the ability to mine the data for trends and common practices.

Table 5: Essential TDR Components/Characteristics of a Capstone TDR

Component/Characteristic                                                Survey 1        Survey 2
                                                                        n     Mean      n     Mean
input selection complete and correct                                    34    3.62      28    3.71
assumptions adequately described and reasonable                         35    3.80      28    3.82
appropriateness of design methods and computer aids                     35    3.23      28    3.50
design inputs correctly incorporated into design                        35    3.83      28    3.71
reasonableness of design outputs relative to design inputs              35    3.63      28    3.75
design inputs for interfacing systems specified                         35    3.40      28    3.21
materials, parts, process and inspection/testing criteria appropriate   35    3.37      28    3.36
hazards adequately/accurately assessed for the design and the
  interface(s) with supporting systems                                  35    3.26      28    3.07
engineering judgments identified, technically justified and supported   35    3.57      28    3.71
regulatory requirements properly assessed and addressed                 35    3.11      27    3.11
environmental issues appropriately addressed including sustainability   35    2.86      28    2.68
operability, maintainability, recyclability addressed appropriately     35    2.63      28    3.04

(Survey 1: General Capstone Constituency; Survey 2: DEEDS Members; n = responses)

Conclusions

General conclusions are not drawn from this limited study, as a full understanding of the relationship between programmatic needs and TDR execution is not evident from the sample data. What is needed is a TDR-like process to understand how both industry and academia can benefit aspiring engineers by fostering a dynamic TDR curricular component for capstone. TDRs, in order to be broadly useful in capstone curricula, have to be flexible enough to accommodate a range of design applications from basic research to product/process development and implementation across a variety of fields from bio-process to aerospace, agriculture, and computer science, among others. Within a single paper, a TDR of TDRs remains somewhat elusive. By the time this paper is published, it is expected that some discussion will occur on this area of capstone at the biennial Capstone Conference (http://www.capstoneconf.org/).


Bibliography

1. Main, Bruce W. (2002). Design Reviews: Checkpoints for Design, Professional Safety, v47n1, January.

2. Gerlick, Robert, Denny Davis, Steven Beyerlein, Jay McCormack, Phillip Thompson, Olakunle Harrison, Michael Trevisan (2008). Assessment Structure and Methodology for Design Processes and Products in Engineering Capstone Courses, Proceedings of the 2008 American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, Pittsburgh.

3. Carberry, Adam R., Hee-Sun Lee, and Matthew W. Ohland (2010). Measuring Engineering Design Self-efficacy, Journal of Engineering Education, v99n1, p71-79.

4. Duesing, Paul, David Baumann, David McDonald, Morrie Walworth, Robert Andersen (2004). Learning and Practicing the Design Review Process in Senior Capstone Design Classes, Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, Salt Lake City.

5. Kauffman, Paul, and Gene Dixon (2011). Vetting Industry Based Capstone Projects Considering Outcome Assessment Goals, International Journal of Engineering Education, v27n6, p1231-1237.

6. Archibald, Mark, Mark Reuber, Blair Allison (2002). Reconciling Well-Defined Capstone Objectives and Criteria with Requirements for Industry Involvement, Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, Montreal.

7. Kemerer, Chris F., Mark C. Paulk (2009). The Impact of Design and Code Reviews on Software Quality: An Empirical Study Based on PSP Data, IEEE Transactions on Software Engineering, v35n4, July/August.

8. Carlin, Bennett C. (2010). The Design Review Process, ASHRAE Journal, June.

9. Ferguson, Chip W., Phillip A. Sanger (2011). AC 2011-204: Facilitating Student Professional Readiness Through Industry Sponsored Senior Capstone Projects, Proceedings of the 2011 American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, Vancouver.

10. Adams, P. (1999). Application in General Industry, Safety Through Design, W. Christensen and F. Manuele, eds. Itasca Il: NSC Press.

11. Gharabagi, Roobik, William J. Ebel (2003). The Senior Design Project: From Concept to Reality, Proceedings of the 2003 American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, Nashville.

12. Colwell, Bob (2000). Design Reviews, Computer, v36n10.

13. D’Astous, Patrick, Françoise Détienne, Willemien Visser, Pierre N. Robillard (2004). Changing Our View on Design Evaluation Meetings Methodology: A Study of Software Technical Review Meetings, Design Studies, v25, p625–655.

14. Falk, Thomas, Carl Rollenhagen, Björn Wahlström (2012). Challenges in Performing Technical Safety Reviews of Modifications – A Case Study, Safety Science, v50, p1558–1568.

15. Bond, Robert H. (1998). Leadership Training - A Different Look at Design Courses, Proceedings of the 1998 American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, Seattle.

16. Kirby, Jeffrey G., Douglas A. Furry, and Donald K. Hicks (1988). Improvements in Design Review Management, Journal of Construction Engineering and Management, v114, p69-82.

17. Kinnersley, K., Roelen, A. (2007). The Contribution of Design to Accidents, Safety Science, v45, p31–60.

18. Taylor, Robert J. (2007). Statistics of Design Error in the Process Industries, Safety Science, v45, p61–73.

19. Bruce, J.W. (2004). Design Inspections and Software Product Metrics in an Embedded Course, Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, Salt Lake City.

20. Wilson, Stacy S., Mark E. Cambron, Michael L. McIntyre (2011). Assessing and Improving a Capstone Design Sequence with Industrial Project Management Techniques, Proceedings of the 2011 American Society for Engineering Education Annual Conference & Exposition, American Society for Engineering Education, Vancouver.

21. Looft, Fred J., Robert C. Labonte (1989). Technical Quality Assurance in Student Design Projects, IEEE Transactions on Education, v32n3, August, p349.

Appendix 1: Survey Instrument

A brief survey to collect data to be compiled and reported to aid all capstones.

Thank you for participating in this survey on engineering student capstone technical design reviews. I know your time is valuable and I greatly appreciate your input. My purpose is to improve student abilities for creating designs.

First, the required notifications: Before you begin, please note that the ECU Institutional Review Board has reviewed the survey and authorized its distribution. Only aggregate data will be shared with broader audiences. Intended audiences, for reporting purposes, are capstone course instructors/administrators. Your privacy will be maintained in all published and written data resulting from the research. Respondent IP addresses will reside on a server behind a firewall.

Your participation is entirely voluntary and you have the right to discontinue the survey at any time. You also have the right to refuse to answer specific questions simply by skipping the question. You may abort the entire process at any time.

Now the survey: While there are only five simple questions requesting data, the intent is to offer a free-ranging opportunity for capstone practitioners to offer insights, recommend pedagogy or pose issues/questions for further examination. Please feel free to use the spaces provided to expand and/or challenge the body-of-knowledge relative to capstone and specifically capstone problem statements. You can also contact me directly (see below) to discuss this research or the data requested.

Industry participants: If you are a participant representing an industry, please generalize the questions for your particular needs/purposes. Some of the language pertains to the academic setting and is recognized to have a broader application in industry. I look forward to hearing your input from the broader application or industry perspective, e.g., what are your expectations for problem statements within your organization/industry? Thank you.

Should you have questions/comments that you wish to share directly, please contact me, Gene Dixon at [email protected] or by telephone 252 737 1031.

Again thank you for participating.

I acknowledge that all data collected will be kept confidential unless compiled in a way that fully shields traceability to its source. Further, I acknowledge that I am a voluntary participant.
_____ Yes. Please continue the survey.
_____ No. Thank you. Please exit the survey now.

My role in capstone is
_____ Academic: instructor, adviser, coordinator, etc.
_____ Industry: sponsor, supporter, SME adviser, etc.
_____ Other (please specify):

For purposes of this survey, a technical design review (TDR) is a deliberate, thoughtful assessment of the capstone project's technical design component. It is not necessarily a written and oral design presentation; rather, a design review assesses design output for compliance with input requirements and for technical adequacy and accuracy. Hazard assessments are often included in design reviews in certain industries. During a technical design review, the design is assessed for adequacy and potential adverse impacts. A TDR does not include site acceptance testing.

Are formal technical design reviews a part of the capstone program(s) in which you are involved?
_____ All of the capstone programs in which I am involved include a technical design review
_____ Some programs include them
_____ No

If you answered all or some in the previous question, how frequently are technical design reviews conducted?
_____ 1 per project lifecycle
_____ 1 per semester
_____ 2 per semester
_____ >2 per semester
_____ Other, please explain

What components or characteristics are essential parts of a capstone level technical design review?

(Rate each: Doesn't Matter / Maybe good to have / Nice to have / Must have)

input selection complete and correct
assumptions adequately described and reasonable
appropriateness of design methods and computer aids
design inputs correctly incorporated into design
reasonableness of design outputs relative to design inputs
design inputs for interfacing systems specified
materials, parts, process and inspection/testing criteria appropriate
hazards adequately/accurately assessed for the design and the interface(s) with supporting systems
engineering judgments identified, technically justified and supported
regulatory requirements properly assessed and addressed
environmental issues appropriately addressed including sustainability
operability, maintainability, recyclability addressed appropriately
Other 1
Other 2
Other 3

Additional comments on characteristics and components of (capstone) technical design reviews:

What indicators of precision and quality do you find effective in evaluating (capstone) TDRs?

Please give an example of an exemplary TDR process from your (capstone) project experience. You may enter it below or email it to [email protected].

Please comment on why the TDR process you have chosen above is considered exemplary.

Instructors: One of the more valuable assets to the capstone experience is industry involvement. Please share this survey link with industry capstone sponsors and ask them to participate. This research will be more valuable when the voice of industry is included. Thank you.