Usability and Usability Testing of Websites: An Example Redesign for Sargent Manufacturing

Total Pages: 16

File Type: PDF, Size: 1020 KB

Usability and Usability Testing of Websites: An Example Redesign for Sargent Manufacturing

A Major Qualifying Project submitted to the faculty of Worcester Polytechnic Institute in partial fulfillment of the requirements for the Degree of Bachelor of Science, Professional Writing.

Submitted by: Joseph Sceviour
Submitted on: 5/22/10
Project Advisor: Chrysanthe Demetry (email: [email protected])

Contents

• Table of Figures
• Abstract
• Introduction
• Background: Perspectives on Usability; Usability Engineering; Web Interfaces; Cultural Usability; Discount Usability Engineering (Heuristic Evaluation, Scenarios, User and Task Observation, Simplified Thinking Aloud, Between-Subject vs. Within-Subject Testing); A Case Study: The Sargent Manufacturing Website
• Methodology: Identifying Different Users; Recruiting Testers; Employing the Discount Usability Testing Approach (Initial Heuristic Evaluation, User and Task Observation, Simplified Thinking Aloud, Scenarios, Heuristic Evaluation of Redesign, User and Task Observation of Redesign, Compare and Repeat as Necessary)
• Usability Problems with the Initial Website
• Redesign of the Access Control Section of the Sargent Manufacturing Website
• Reflections on Usability Testing: Heuristics; Scenarios; User Task Observation; Simplified Thinking Aloud; Cultural Usability Testing
• Works Cited
• Appendix 1: Testing Introduction Form; Testing Situation #1-#3 Backgrounds and Metrics; Post-Testing Interview Questions

Table of Figures

• Figure 1: Circuit of Culture (Melles, 2008)
• Figure 2: Nielsen, "Why You Only Need to Test with 5 Users"
• Figure 3: Sargent Manufacturing's Homepage (www.sargentlock.com)
• Figure 4: Methodology Flowchart
• Figure 5: Sargent Main Page with Jargon Highlighted
• Figure 6: Poor Image Recognition on Current Sargent Site
• Figure 7: No Common Information Section
• Figure 8: No Common Information Section (continued)
• Figure 9: Titles Looking Like Links
• Figure 10: Icon Users Cannot Click
• Figure 11: Old Access Control Page
• Figure 12: New Access Control Page
• Figure 13: Product Comparison
• Figure 14: Product Line Sub-Page
• Figure 15: Hover-Over Text
• Figure 16: Product Line and Hardware Combination Page
• Figure 17: Levers, Finishes, and Options
• Figure 18: Product Identifier (Top)
• Figure 19: Product Identifier (Bottom)
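Figure 2 refers to Nielsen's well-known argument for small-sample testing, which underpins the discount usability approach this report uses. As a minimal sketch of that argument (not code from the report itself): the expected share of usability problems found by n testers is 1 - (1 - L)^n, where L is the chance that a single tester exposes a given problem; Nielsen and Landauer's cross-project average of L = 0.31 is assumed here purely for illustration.

```python
# Sketch of the Nielsen-Landauer estimate behind "test with 5 users".
# L is the probability that one tester exposes a given usability problem;
# L = 0.31 is the average Nielsen reports across projects, assumed here.

def problems_found(n_testers: int, l: float = 0.31) -> float:
    """Expected fraction of usability problems found by n testers."""
    return 1 - (1 - l) ** n_testers

if __name__ == "__main__":
    for n in range(1, 9):
        print(f"{n} testers: {problems_found(n):.0%} of problems found")
    # With L = 0.31, five testers already expose roughly 84% of problems,
    # which is why discount usability testing favors several small rounds.
```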
Recommended publications
  • Human Performance Regression Testing
    Human Performance Regression Testing
    Amanda Swearngin, Myra B. Cohen (Dept. of Computer Science & Eng., University of Nebraska-Lincoln, Lincoln, NE 68588-0115, USA)
    Bonnie E. John, Rachel K. E. Bellamy (IBM T. J. Watson Research Center, P.O. Box 704, Yorktown Heights, NY 10598, USA)
    Abstract—As software systems evolve, new interface features such as keyboard shortcuts and toolbars are introduced. While it is common to regression test the new features for functional correctness, there has been less focus on systematic regression testing for usability, due to the effort and time involved in human studies. Cognitive modeling tools such as CogTool provide some help by computing predictions of user performance, but they still require manual effort to describe the user interface and tasks, limiting regression testing efforts. In recent work, we developed CogTool-Helper to reduce the effort required to generate human performance models of existing systems. We build on this work by providing task-specific test case generation and present our vision for human performance regression testing (HPRT) that …
    … common tasks, necessitating longer mouse movements and, in fact, decrease efficiency. In addition, many toolbars with small icons may add screen clutter and may decrease a new user's ability to discover how to accomplish a task over a simpler UI design. Usability testing has traditionally been empirical, bringing end-users into a testing facility, asking them to perform tasks on the system (or prototype), and measuring such things as the time taken to perform the task, the percentage of end-users who can complete the task in a fixed amount of time, and the number and type of errors made by the end-users.
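The excerpt above names the three classic empirical usability measures: time on task, completion rate within a time budget, and error counts. A minimal sketch of how a test session log might be reduced to those numbers; the Session record and the sample data are hypothetical, not from the paper.

```python
# Reduce usability-test session records to the three measures named above.
# Everything here (record shape, sample values) is illustrative only.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Session:
    tester: str
    seconds: float    # time taken to perform the task
    completed: bool   # finished within the allotted time?
    errors: int       # wrong clicks, backtracks, dead ends

sessions = [
    Session("P1", 74.2, True, 1),
    Session("P2", 131.5, False, 4),
    Session("P3", 58.0, True, 0),
]

print(f"mean time on task: {mean(s.seconds for s in sessions):.1f} s")
print(f"completion rate:   {sum(s.completed for s in sessions) / len(sessions):.0%}")
print(f"errors per tester: {mean(s.errors for s in sessions):.1f}")
```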
  • Cognitive and Usability Engineering Methods for the Evaluation of Clinical Information Systems
    Cognitive and Usability Engineering Methods for the Evaluation of Clinical Information Systems
    Andre W. Kushniruk (Information Technology Division, Department of Mathematics and Statistics, York University, Toronto, Canada) and Vimla L. Patel (Laboratory of Decision Making and Cognition, Department of Biomedical Informatics, Columbia University, New York, NY)
    Methodological Review, Journal of Biomedical Informatics 37 (2004) 56–76. Received 29 September 2003.
    Abstract: Increasingly, healthcare policy and decision makers are demanding evidence to justify investments in health information systems. This demand requires an adequate evaluation of these systems. A wide variety of approaches and methodologies have been applied in assessing the impact of information systems in health care, ranging from controlled clinical trials to use of questionnaires and interviews with users. In this paper we describe methodological approaches which we have applied and refined for the past 10 years for the evaluation of health information systems. The approaches are strongly rooted in theories and methods from cognitive science and the emerging field of usability engineering. The focus is on assessing human-computer interaction and, in particular, the usability of computer systems in both laboratory and naturalistic settings. The methods described can be a part of the formative evaluation of systems during their iterative development, and can also complement more traditional assessment methods used in summative evaluation of completed systems. The paper provides a review of the general area of systems evaluation with the motivation and rationale for methodological approaches underlying usability engineering and cognitive task analysis as applied to health information systems.
  • Isys Information Architects Inc
    Isys Information Architects Inc.
    Isys Information Architects Inc. specializes in the design and development of robust, highly usable information systems. Isys focuses on ease of use, recognizing that software should assist the user in the performance of some task rather than becoming a task in itself. Isys was founded by Brian Hayes, a former professor of Industrial Engineering and system design consultant whose clients have included AT&T, General Electric, General Motors, Lucent Technologies, NASA, Siemens, the U.S. Air Force, and the U.S. Army.
    We are a full-service provider of user interface design and usability engineering services. Our careful attention to the user interface can reduce system development time, increase user satisfaction, and reduce training costs. Our software development department specializes in the design and development of robust, highly usable information systems for the Windows operating system. We provide custom programming, database, and application development services.
    The Interface Hall of Shame is an irreverent look at ineffective interface design. The site includes a collection of images from commercial, corporate, and shareware applications that illustrate how software should not be designed. The Interface Hall of Fame is a collection of images that demonstrate interface design solutions that are both creative and effective.
    Isys Information Architects Inc., 5401 McNairy Drive, Suite 100, Greensboro, NC 27455. © 1996-1999 Isys Information Architects Inc. All rights reserved.
  • Usability Basics for Software Developers
    Usability Basics for Software Developers
    Xavier Ferré and Natalia Juristo, Universidad Politécnica de Madrid; Helmut Windl, Siemens AG, Germany; Larry Constantine, Constantine & Lockwood
    Sidebar: This tutorial examines the relationship between usability and the user interface and discusses how the usability process follows a design-evaluate-redesign cycle.
    In recent years, software system usability has made some interesting advances, with more and more organizations starting to take usability seriously [1]. Unfortunately, the average developer has not adopted these new concepts, so the usability level of software products has not improved. Contrary to what some might think, usability is not just the appearance of the user interface (UI). Usability relates to how the system interacts with the user, and it includes five basic attributes: learnability, efficiency, user retention over time, error rate, and satisfaction. Here, we present the general usability process for building a system with the desired level of usability. This process, which most usability practitioners apply with slight variation, is structured around a design-evaluate-redesign cycle. Practitioners initiate the process by analyzing the targeted users and …
    Usability engineering defines the target usability level in advance and ensures that the software developed reaches that level. The term was coined to reflect the engineering approach some usability specialists take [3]. It is "a process through which usability characteristics are specified, quantitatively and early in the development process, and measured throughout the process" [4]. Usability is an issue we can approach from multiple viewpoints, which is why many different disciplines, such as psychology, computer science, and sociology, are trying to tackle it.
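The excerpt defines usability engineering as specifying usability characteristics quantitatively in advance and measuring them throughout development. A minimal sketch of such a quantitative usability specification check; the five attribute names come from the excerpt, but every target and measured value is invented for illustration.

```python
# Sketch of "specify usability quantitatively, then measure": each of the
# five attributes from the excerpt gets a target, and measured values are
# checked against it. All targets and measurements are illustrative only.
TARGETS = {
    "learnability_minutes": 30.0,   # time for a new user to master basic tasks
    "efficiency_tasks_per_hour": 12.0,
    "retention_score": 0.80,        # fraction of tasks recalled after two weeks
    "error_rate_per_task": 0.5,
    "satisfaction_score": 4.0,      # 1-5 questionnaire average
}

MEASURED = {
    "learnability_minutes": 42.0,
    "efficiency_tasks_per_hour": 14.5,
    "retention_score": 0.85,
    "error_rate_per_task": 0.3,
    "satisfaction_score": 3.7,
}

# For "lower is better" attributes, the measurement must not exceed the target.
LOWER_IS_BETTER = {"learnability_minutes", "error_rate_per_task"}

for attr, target in TARGETS.items():
    got = MEASURED[attr]
    ok = got <= target if attr in LOWER_IS_BETTER else got >= target
    print(f"{attr:28s} target {target:6.1f}  measured {got:6.1f}  {'PASS' if ok else 'FAIL'}")
```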
  • Introducing Usability Engineering and Universal Design in the Computer Science Curriculum
    Introducing Usability Engineering and Universal Design in the Computer Science Curriculum
    Authors: Kirsten Ribu (Oslo University College, Postboks 4, St. Olavs plass, 0130 Oslo) and … Nes (Avenir)
    Abstract: Software and information systems are usually developed for a stereotypical user: young, white, male, and non-disabled. In order to understand and include a variety of user groups, universal design and usability engineering must become part of the software engineering curriculum. Learning to develop systems for diverse user groups means that students must acquire knowledge of the relations between humans and technological solutions, as well as the requirements of disabled users. In order to communicate the importance and meaning of accessibility and usability, classroom exercises like using a computer with one hand, surfing the Internet wearing sunglasses or glasses smeared with oil or grease, and working with distorted computer screens can illustrate the challenges of designing software solutions for all users. We describe how students can be made aware of the social importance of software engineering, and conscious of their professional and democratic responsibilities. The paper outlines a suggested curriculum, defining the major disabled user groups and their accessibility and usability requirements, and describes how to approach teaching universal design and usability. At Oslo University College, a new bachelor program called 'Applied Computer Technology' was initiated in 2005. The main focus of this program is the practical use of information systems in specific contexts, and the principles and ideas outlined in this paper will be incorporated into the third-year course 'Modern Methods for Systems Development'. The students will work on real projects, developing solutions for real users.
  • 1. Can You Explain the PDCA Cycle and Where Testing Fits In?
    1. Can you explain the PDCA cycle and where testing fits in?
    Software testing is an important part of the software development process. Normal software development follows four important steps, referred to in short as the PDCA (Plan, Do, Check, Act) cycle. Let's review the four steps in detail.
    1. Plan: Define the goal and the plan for achieving that goal.
    2. Do/Execute: Execute according to the strategy decided on during the planning stage.
    3. Check: Check/test to ensure that we are moving according to plan and are getting the desired results.
    4. Act: If the check reveals any issues, take appropriate corrective action and revise the plan.
    Developers and other stakeholders of the project do the "planning and building," while testers do the check part of the cycle. Therefore, software testing is done in the check part of the PDCA cycle.
    2. What is the difference between white box, black box, and gray box testing?
    Black box testing is a testing strategy based solely on requirements and specifications. It requires no knowledge of the internal paths, structures, or implementation of the software being tested. White box testing is a testing strategy based on the internal paths, code structures, and implementation of the software being tested; it generally requires detailed programming skills. There is one more type of testing, called gray box testing: here we look into the "box" being tested just long enough to understand how it has been implemented.
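To make the black-box/white-box distinction concrete, here is a small sketch; the function under test and both test classes are hypothetical, not from the interview text. The black-box test is written only from a stated requirement, while the white-box tests deliberately exercise each internal branch.

```python
# Hypothetical example contrasting the two strategies from the answer above.
import unittest

def shipping_cost(weight_kg: float) -> float:
    """Flat rate under 2 kg, per-kg rate at or above 2 kg (assumed spec)."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg < 2.0:
        return 5.00
    return weight_kg * 3.00

class BlackBoxTests(unittest.TestCase):
    # Written purely from the requirement "light parcels cost a flat $5":
    # no knowledge of the code's internal branches is assumed.
    def test_light_parcel_flat_rate(self):
        self.assertEqual(shipping_cost(1.5), 5.00)

class WhiteBoxTests(unittest.TestCase):
    # Written from the implementation: exercises each branch explicitly,
    # including the boundary at exactly 2 kg and the error path.
    def test_boundary_branch(self):
        self.assertEqual(shipping_cost(2.0), 6.00)

    def test_error_branch(self):
        with self.assertRaises(ValueError):
            shipping_cost(0)

if __name__ == "__main__":
    unittest.main()
```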
  • Orthogonal Array Application for Optimized Software Testing
    Orthogonal Array application for optimal combination of software defect detection techniques choices
    Ljubomir Lazić (Technical Faculty, University of Novi Pazar, Vuka Karadžića bb, 36300 Novi Pazar, Serbia; http://www.np.ac.yu) and Nikos Mastorakis (Military Institutions of University Education, Hellenic Naval Academy, Terma Hatzikyriakou, 18539 Piraeus, Greece)
    WSEAS Transactions on Computers
    Abstract: In this paper, we consider a problem that arises in black box testing: generating small test suites (i.e., sets of test cases) where the combinations that have to be covered are specified by input-output parameter relationships of a software system. That is, we only consider combinations of input parameters that affect an output parameter, and we do not assume that the input parameters have the same number of values. To solve this problem, we propose interaction testing, particularly an Orthogonal Array Testing Strategy (OATS), as a systematic, statistical way of testing pair-wise interactions. In the software testing process (STP), it provides a natural mechanism for testing systems to be deployed on a variety of hardware and software configurations. The combinatorial approach to software testing uses models to generate a minimal number of test inputs so that selected combinations of input values are covered. The most common coverage criterion is two-way or pairwise coverage of value combinations, though for higher confidence three-way or higher coverage may be required. This paper presents some examples of software-system test requirements and corresponding models for applying the combinatorial approach to those test requirements. The method bridges contributions from mathematics, design of experiments, software testing, and algorithms for application to usability testing.
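The pairwise idea in the abstract can be illustrated with a small greedy cover. This is a generic sketch of 2-way combinatorial test selection, not the paper's orthogonal-array construction, and the parameter names and values are invented.

```python
# Greedy pairwise (2-way) test selection: keep adding the candidate test
# that covers the most not-yet-covered parameter-value pairs.
from itertools import combinations, product

parameters = {
    "browser": ["Firefox", "Chrome", "Safari"],
    "os": ["Windows", "macOS", "Linux"],
    "locale": ["en", "de"],
}

names = list(parameters)

def pairs_of(test):
    """All parameter-value pairs covered by one test case."""
    return {((a, test[a]), (b, test[b])) for a, b in combinations(names, 2)}

# Every value pair for every pair of parameters must be covered.
uncovered = {
    ((a, va), (b, vb))
    for a, b in combinations(names, 2)
    for va in parameters[a]
    for vb in parameters[b]
}

suite = []
while uncovered:
    best = max(
        (dict(zip(names, values)) for values in product(*parameters.values())),
        key=lambda t: len(pairs_of(t) & uncovered),
    )
    suite.append(best)
    uncovered -= pairs_of(best)

print(f"{len(suite)} tests cover all pairs "
      f"(vs. {len(list(product(*parameters.values())))} exhaustive):")
for t in suite:
    print(t)
```

With these three parameters the exhaustive suite has 18 tests, while pairwise coverage needs roughly 9; the gap widens quickly as parameters are added, which is the practical appeal of the approach.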
  • Cost Justifying Usability: A Case Study at Ericsson
    Cost Justifying Usability: A Case Study at Ericsson
    Parisa Yousefi and Pegah Yousefi, School of Computing, Blekinge Institute of Technology, SE-371 39 Karlskrona, Sweden
    Master thesis in Computer Science, Thesis ID MCS-2011-01, January 2011. This thesis is submitted to the School of Computing at Blekinge Institute of Technology in partial fulfilment of the requirements for the degree of Master of Science in Computer Science. The thesis is equivalent to 20 weeks of full-time studies.
    External advisor: Helen Sjelvgren, Ericsson. University advisors: Dr. Kari Rönkkö, Ph.D., and Jeff Winter, Licentiate, School of Computing.
    Abstract: Usability in human-computer interaction and user-centered design is indeed a key factor in the success of any software product. However, the software industry, in particular where the attitude of revenue management prevails, expresses a need for an economic and academic case justifying the usability discipline. In this study we investigate the level of usability, the usability issues, and the gaps concerning usability activities and the potential users in a part of the charging system products at Ericsson. We also try to identify the cost-benefit factors usability brings to this project, in order to attempt justifying the cost of usability for this particular product.
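Cost-justification arguments of the kind this thesis investigates are often made with a back-of-the-envelope benefit model: the cost of usability work versus the value of time saved across all users. A minimal sketch; every number below is a made-up assumption, not a figure from the thesis.

```python
# Back-of-the-envelope usability cost-benefit model. All figures are
# illustrative assumptions, not data from the Ericsson case study.
usability_cost = 40_000.0        # testing rounds + redesign effort ($)
users = 300                      # operators using the system's UI
tasks_per_day = 25
seconds_saved_per_task = 6.0     # assumed post-redesign improvement
working_days = 220
hourly_cost = 55.0               # loaded cost of a user's hour ($)

hours_saved = users * tasks_per_day * seconds_saved_per_task * working_days / 3600
benefit = hours_saved * hourly_cost

print(f"hours saved per year: {hours_saved:,.0f}")
print(f"annual benefit:       ${benefit:,.0f}")
print(f"payback ratio:        {benefit / usability_cost:.1f}x")
```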
  • SSW 567 Software Testing, Quality, and Maintenance
    School of Systems and Enterprises, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030
    SSW 567 Software Testing, Quality, and Maintenance
    PURPOSE: This memorandum provides the syllabus for SSW 567.
    TEXTS:
    • A Practitioner's Guide to Software Test Design by Copeland, Artech House [PG]
    • Software Testing: A Craftsman's Approach by Jorgensen [ST]
    • Software Measurement and Estimation by Laird, Wiley (optional) [SME]
    SOFTWARE:
    • Open Office/Microsoft Office
    • Possibly the ATM Simulation software system by Russell C. Bjork [ATM]
    • Freeware tools for configuration management (e.g., Subversion), test management (e.g., Testopia), and defect management (e.g., Bugzilla, Mantis)
    COURSE DESCRIPTION: This course is an in-depth study of software testing and quality throughout the development process. The successful student will learn to effectively analyze and test software systems using a variety of methods, as well as to select the best-fit test case design methodologies to achieve the best results. There is extensive hands-on project work in this course. Emphasis is placed on testing as a technical investigation of a product undertaken to determine system quality, and on the clear communication of results, both of which are crucial to success as a tester.
    COURSE PREREQUISITES: Working knowledge of C, C++, or Java (or some other programming language). Extensive programming will not be required.
    COURSE OUTLINE (first modules):
    • Lecture 1: Getting Started. Description: definitions of testing, QA, faults, failures and their costs; fundamental issues of testing. Readings: [PG] ch. 1, 15, 16; [ST] ch. 1 and 2. Project: install configuration management tools, write initial program.
    • Lecture 2: Configuration Management. Description: SCM concepts, processes, and tools; continuous integration. Readings: [ST] ch. 3 (background). Project: test initial program with seeded bugs.
  • The Role of Usability in Software Validation - Case: Medical Device Manufacturing
    The Role of Usability in Software Validation. Case: Medical Device Manufacturing
    Emmi Kaivosoja. Master's thesis, Aalto University School of Science, Degree Programme in Computer Science and Engineering, Department of Computer Science. Espoo, October 1, 2015. 9+77+2 pages, in English.
    Supervisor: Professor Marko Nieminen, Dr.Sc. (Tech). Thesis advisor: Ninna Koivunen, M.Sc.
    Abstract: This master's thesis follows a software project in a medical device company. The software is a manufacturing execution system (MES) which guides the production of medical devices and collects traceability data, including the device history files. Because the software is used in the manufacturing of medical devices and in the implementation of the company quality system, it needs to be validated for its intended use. The validation process must comply with the guidance and regulations of the Food and Drug Administration. The software system is implemented by a software supplier, but the customer organization is responsible for the validation process. In addition to fulfilling the regulatory requirements, the software is expected to increase productivity.
  • Flash Usability Report
    Website Tools and Applications with Flash: Design Guidelines Based on User Testing of 46 Flash Tools
    By Hoa Loranger, Amy Schade, and Jakob Nielsen. Additional research by Rolf Molich and Kara Pernice.
    Nielsen Norman Group, 48105 Warm Springs Blvd., Fremont, CA 94539-7498, USA. www.nngroup.com
    Copyright © Nielsen Norman Group, all rights reserved. To get your own copy, download from: http://www.nngroup.com/reports/website-tools-and-applications-flash/
    About This Free Report: This report is a gift for our loyal audience of UX enthusiasts. Thank you for your support over the years. We hope this information will aid your efforts to improve user experiences for everyone. The research for this report was done in 2013, but the majority of the advice may still be applicable today, because people and principles of good design change much more slowly than computer technology does. We sometimes make older report editions available to our audience at no cost, because they still provide interesting insights. Even though these reports discuss older designs, it's still worth remembering the lessons from mistakes made in the past. If you don't remember history, you'll be doomed to repeat it. We regularly publish new research reports that span a variety of web and UX related topics. These reports include thousands of actionable, illustrated user experience guidelines for creating and improving your web, mobile, and intranet sites. We sell our new reports to fund independent, unbiased usability research; we do not have investors, government funding or research grants that pay for this work. Visit our reports page at https://www.nngroup.com/reports/ to see a complete list of these reports.
  • Standard Glossary of Terms Used in Software Testing, Version 3.0: Advanced Test Analyst Terms
    Standard Glossary of Terms Used in Software Testing, Version 3.0: Advanced Test Analyst Terms
    International Software Testing Qualifications Board. Copyright notice: this document may be copied in its entirety, or extracts made, if the source is acknowledged. Copyright © International Software Testing Qualifications Board (hereinafter called ISTQB®).
    • acceptance criteria (Ref: IEEE 610): The exit criteria that a component or system must satisfy in order to be accepted by a user, customer, or other authorized entity.
    • acceptance testing (Ref: after IEEE 610; see also: user acceptance testing): Formal testing with respect to user needs, requirements, and business processes, conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers, or other authorized entity to determine whether or not to accept the system.
    • accessibility testing (Ref: Gerrard): Testing to determine the ease by which users with disabilities can use a component or system.
    • accuracy (Ref: ISO 9126; see also: functionality): The capability of the software product to provide the right or agreed results or effects with the needed degree of precision.
    • accuracy testing (see also: accuracy): Testing to determine the accuracy of a software product.
    • actor: User or any other person or system that interacts with the test object in a specific way.
    • actual result (synonym: actual outcome): The behavior produced/observed when a component or system is tested.
    • adaptability (Ref: ISO 9126; see also: portability): The capability of the software product to be adapted for different specified environments without applying actions or means other than those provided for this purpose for the software considered.
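Three of the glossary terms above (acceptance criteria, actual result, acceptance testing) fit together naturally, as this small sketch shows; the system under test and the criteria are hypothetical, not drawn from the glossary.

```python
# Tiny sketch tying together three glossary terms: acceptance criteria,
# actual result, and acceptance testing. All names and values are invented.
def login(username: str, password: str) -> str:
    """Stand-in for the test object; returns a status string."""
    return "welcome" if (username, password) == ("alice", "s3cret") else "denied"

# Acceptance criteria: exit conditions the system must satisfy to be accepted.
acceptance_criteria = [
    ("valid credentials are accepted", ("alice", "s3cret"), "welcome"),
    ("invalid credentials are rejected", ("alice", "wrong"), "denied"),
]

accepted = True
for description, args, expected in acceptance_criteria:
    actual = login(*args)  # the "actual result" observed during testing
    ok = actual == expected
    accepted = accepted and ok
    print(f"{'PASS' if ok else 'FAIL'}: {description}")

print("system accepted" if accepted else "system rejected")
```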