Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar.
The approved original version of this thesis is available in print at TU Wien Bibliothek.


Automatisierung von heuristikbasierter Usability-Inspektion mittels GUI-Event-Sequenzierung

Diplomarbeit

in teilweiser Erfüllung der Anforderungen zur Erlangung des Grades

Diplom-Ingenieur

in Software Engineering und Internet Computing

von

Amir Banaouas
Matrikelnummer 0927741

an die Fakultät für Informatik der Technischen Universität Wien

Betreuer: Thomas Grechenig
Assistent: Stefan Taber

Wien, Oktober 10, 2019

(Unterschrift der Verfasser)        (Unterschrift der Betreuer)

Technische Universität Wien
A-1040 Wien, Karlsplatz 13, Tel. +43-1-58801-0, www.tuwien.ac.at


Automation of Heuristic-based Usability Inspection using GUI Event Sequencing

Master's Thesis

submitted in partial fulfillment of the requirements for the degree of

Master of Science

in Software Engineering and Internet Computing

by

Amir Banaouas
Registration Number 0927741

to the Faculty of Informatics at TU Wien

Advisor: Thomas Grechenig
Assistance: Stefan Taber

Vienna, October 10, 2019

(Signature of Author)        (Signature of Advisor)

Technische Universität Wien
A-1040 Wien, Karlsplatz 13, Tel. +43-1-58801-0, www.tuwien.ac.at
Automation of Heuristic-based Usability Inspection using GUI Event Sequencing

Master's Thesis

submitted in partial fulfillment of the requirements for the degree of

Master of Science

in Software Engineering and Internet Computing

by

Amir Banaouas
Registration Number 0927741

to the Faculty of Informatics at TU Wien

elaborated at the Institute of Information Systems Engineering, Research Group for Industrial Software (INSO)

Advisor: Thomas Grechenig
Assistance: Stefan Taber

Vienna, October 10, 2019

Technische Universität Wien, Forschungsgruppe INSO
A-1040 Wien, Wiedner Hauptstr. 76/2/2, Tel. +43-1-587 21 97, www.inso.tuwien.ac.at
Statement by Author

Amir Banaouas
Lerchengasse 28-30/1, 1080 Wien

I hereby declare that I am the sole author of this thesis, that I have completely indicated all sources and help used, and that all parts of this work – including tables, maps and figures – if taken from other works or from the internet, whether copied literally or by sense, have been labelled including a citation of the source.

(Place, Date)        (Signature of Author)


Acknowledgements

I would first like to thank my thesis advisor Thomas Grechenig, Professor of Computer Aided Automation at the Faculty of Informatics of the Vienna University of Technology, and his Assistant Stefan Taber of the Research Group for Industrial Software. A very special gratitude goes out to Assistant Stefan Taber for finding time for a meeting whenever I asked for it, whenever I faced a challenge, and whenever I was behind schedule. I highly appreciated his valuable guidance and how he consistently pushed me in the right direction to improve my work. I am very grateful to Dr. Brigitte Brem, who reviewed the sections dealing with usability engineering and taught me important lessons with patience. Also, a special thanks goes to Christoph Wimmer, whose advice and important feedback on all the thesis helped shape the final version of the study. I would like to thank as well the team behind Micro Focus, especially Tal Halperin, for extending my trial license of their popular tool Unified Functional Testing (UFT). Finally, my utmost gratitude and respect is expressed to the Vienna University of Technology for teaching me so much about the joy and sorrow that comes with pursuing a dream, about discipline and hard work, and for changing my way of thinking to see new chances and opportunities.


Abstract

The degree of usability of a Graphical User Interface (GUI) might deteriorate quickly after a design change, and detecting this quality loss can be challenging. Automating heuristic-based usability inspection could reduce the efforts required for sustaining an appropriate level of usability. On the other hand, GUI testing is a field highly conversant with automation. The purpose of this study is to show that even though GUI testing and usability evaluation are two distinct fields, they might influence each other. It is theorized that the automation of heuristic evaluation, a usability inspection technique, can be greatly improved thanks to GUI event sequencing, a GUI test automation technique. To that end, an existing heuristic set is selected and tailored to fit the context of Windows desktop applications. The derived set is then analysed in accordance with the examined feasibility of its automation. The guidelines are classified into three categories: those suitable for automation with GUI event sequencing, those appropriate for automation without sequencing, and those fit for manual testing. Then, a sample of heuristics from the first category is evaluated in a GUI testing tool to validate the feasibility of their automation in practice. The results showed that 55% of the heuristics could be tested automatically with GUI event sequencing and that 75% of them could be automated in general. This indicates a potential to greatly decrease the manual work of the usability evaluator. Moreover, a pattern was noted among the guidelines of the first category: their evaluation requires at least one user-system interaction uncatchable by GUI testing tools and does not focus on attributes such as the aesthetic value. These results imply that the heuristic set can be arranged in advance so it only consists of guidelines suitable for automation.

Keywords: Automated usability testing, automated heuristic evaluation, GUI testing, event sequencing, event-driven, Windows desktop application.


Kurzfassung

Der Nutzungsgrad einer grafischen Benutzeroberfläche (GUI) kann sich nach einer Designänderung schnell verschlechtern, und das Erkennen dieses Qualitätsverlustes kann schwer sein. Die Automatisierung einer heuristikbasierten Usability-Inspektion könnte den Aufwand für die Aufrechterhaltung eines angemessenen Grades der Benutzerfreundlichkeit verringern. Andererseits ist das GUI-Testen ein Bereich, der mit Automatisierung vertraut ist. Das Ziel dieser Arbeit ist es zu zeigen, dass sich GUI-Testen und Usability-Testen gegenseitig beeinflussen können. Es wird theoretisiert, dass die Automatisierung der heuristischen Evaluierung dank GUI-Event-Sequenzierung, einer GUI-Automatisierungstechnik, erheblich verbessert werden kann. Zu diesem Zweck wird ein vorhandenes Heuristik-Set ausgewählt und an den Kontext der Windows-Desktopanwendungen angepasst. Das resultierende Set wird dann entsprechend der Durchführbarkeit seiner Automatisierung analysiert. Die untersuchten Richtlinien werden in drei Kategorien unterteilt: diejenigen, die für die Automatisierung mit GUI-Event-Sequenzierung geeignet sind, diejenigen, die für die Automatisierung ohne Sequenzierung geeignet sind, und diejenigen, die für manuelle Tests geeignet sind. Anschließend werden manche Heuristiken der ersten Kategorie in einem Testwerkzeug bewertet, um die Machbarkeit ihrer Automatisierung in der Praxis zu überprüfen. Die Ergebnisse zeigten, dass 55% der Heuristiken mit GUI-Event-Sequenzierung und 75% davon allgemein automatisch getestet werden konnten. Dies deutet auf ein Potenzial hin, die manuelle Arbeit des Usability-Evaluators erheblich zu reduzieren. Darüber hinaus wurde ein Muster unter den Richtlinien der ersten Kategorie festgestellt: Ihre Bewertung erfordert mindestens eine Benutzer-System-Interaktion, die von Testwerkzeugen nicht erfasst werden kann, und konzentriert sich nicht auf Attribute wie den ästhetischen Wert. Diese Ergebnisse implizieren, dass das Heuristik-Set vorab so angeordnet werden kann, dass es nur aus Richtlinien besteht, die für die Automatisierung geeignet sind.


Contents

1 Introduction
  1.1 Problem Statement
  1.2 Motivation
  1.3 Aim of the work
  1.4 Structure of the work

2 Basics of Graphical User Interfaces and Usability Engineering
  2.1 Graphical User Interfaces
    2.1.1 User Interface and Interaction Design
    2.1.2 Components of Graphical User Interfaces
  2.2 Usability Engineering
    2.2.1 Definition of Usability
    2.2.2 Definition of Usability Engineering
    2.2.3 Usability Metrics
    2.2.4 Usability Criteria

3 Fundamentals of Software Testing
  3.1 Definitions and Basics in Software Testing
    3.1.1 Errors, Faults and Failures
    3.1.2 Testing Process
    3.1.3 Testing Techniques
    3.1.4 Testing Methods
    3.1.5 Testing Types
    3.1.6 Testing Levels
    3.1.7 Test Automation
  3.2 GUI Testing
    3.2.1 Definition of GUI Testing
    3.2.2 Aim of GUI Testing
    3.2.3 Benefits of Automating GUI Testing
    3.2.4 Methods of Automating Testing GUIs
    3.2.5 Limitations of GUI Test Automation
  3.3 Usability Evaluation
    3.3.1 Defining Usability Evaluation
    3.3.2 Benefits of Usability Tests
    3.3.3 Methods of Usability Evaluation
    3.3.4 The Need for Automating Usability Evaluation
    3.3.5 Toward Automating Usability Evaluation
    3.3.6 Limits of Usability Evaluation

4 Automating Heuristic-based Usability Inspection with GUI Event Sequencing
  4.1 Comparing GUI and Usability Evaluation
    4.1.1 Similarities between GUI Testing and Usability Evaluation
    4.1.2 Differences between GUI Testing and Usability Evaluation
    4.1.3 Review of GUI Test Automation Tools
    4.1.4 Review of Usability Evaluation Tools
    4.1.5 GUI Testing Tools Versus Usability Tools
  4.2 Evolution of Heuristics
    4.2.1 Ergonomic Roots of Usability Heuristics
    4.2.2 Traditional Versus Modern Heuristics
  4.3 GUI Event Sequencing Potential

5 Proof of Concept for Automating Heuristic-based Usability Inspection with GUI Event Sequencing
  5.1 Summary of the Problem Addressed
  5.2 Deriving and Structuring adequate Heuristics
    5.2.1 Filtering Usability Heuristics for Windows Applications
    5.2.2 Listing Heuristics Compatible with Automation by Event Sequencing
    5.2.3 Listing Heuristics Fitting for Automation without Sequencing
    5.2.4 Listing Heuristics Suitable for Manual Testing
  5.3 Development of Heuristic Evaluation in a Testing Tool
    5.3.1 Preparing Needed Development Environment
    5.3.2 Instances of Usability Heuristics Evaluation in a GUI Test Automation Tool
  5.4 Evaluation of the Results
    5.4.1 Assessment of the outcome of the Heuristics Filtering Process
    5.4.2 Analysis of Automation Feasibility in Heuristic Evaluation

6 Conclusion

Bibliography
  References
  Tool-Related Web References

A Appendix: Additional UFT Flow Diagrams of automated heuristic evaluations


List of Figures

1.1 Comparative snapshots of the tabbed pane in the configuration settings window of the Dolphin emulator under the English (left) and German (right) localised versions
2.1 The stages of the usability engineering lifecycle (see [34])
3.1 Curve showing percentage of found usability issues in relation with the number of evaluators (see [85])
3.2 Representation of usability problems discovered by evaluators in a heuristic evaluation case study (see [97])
3.3 Curve showing the return on investment in relation to the number of evaluators in a heuristic evaluation case study (see [97])
3.4 Partial snapshot of a user test sample from "UserTesting" delivered with attached relevant notes and observations (see [109])
4.1 Screenshot of a pagination area under search results retrieved from the Ebay website
5.1 UFT flow diagram for a general test case in Eclipse Photon with an expanded view of its nested actions
5.2 UFT flow diagram for a general test case in the MyFlight application with an expanded view of its nested actions
5.3 UFT flow diagram for guideline 1.0/4 "Fast Response"
5.4 UFT flow diagram for guideline 1.3/10 "Upper and Lower Case Equivalent in Search"
5.5 Side by side view of the tabbing numeric order in the login windows of Steam (left)
5.6 UFT flow diagram for guideline 2.7.5/3 "User-Specified Windows" with an expanded view of its nested actions
5.7 An instance of an Eclipse custom help window prompted in the Java perspective of the Eclipse Integrated Development Environment (IDE)
5.8 The three states of the "Order" button ("disabled" on the left, "enabled" in the middle, and "invisible" on the right)
5.9 UFT flow diagram for guideline 3.0/1 "Flexible Sequence Control" while testing
5.10 UFT flow diagram for guideline 3.0/20 "Indicating Control Lockout"
5.11 The availability of the "Search" button in the "Search" window of the Eclipse Photon
5.12 UFT flow diagram for guideline 3.2/10 "Only Available Options Offered"
5.13 Screenshot of the Netflix desktop application when started up without Internet connection
5.14 UFT flow diagram for guideline 4.3/13 "Cursor Placement Following Error"
5.15 UFT flow diagram for guideline 6.0/5 "Protection from Interrupts"
5.16 Error message with a confirmation of a booking order to be deleted from the MyFlight application
5.17 Overall classification results after the analysis of test automation feasibility on a derived set of adequate heuristics
A.1 UFT flow diagram for guideline 1.4/15 "Explicit Tabbing to Data Fields"
A.2 UFT flow diagram for guideline 1.7/3 "Non-Disruptive Error Messages"
A.3 UFT flow diagram for guideline 3.0/1 "Flexible Sequence Control"
A.4 UFT flow diagram for guideline 3.1.3/7 "Menu Selection by Keyed Entry"
A.5 UFT flow diagram for guideline 3.3/3 "Cancel Option"
A.6 UFT flow diagram for guideline 3.5/10 "UNDO to Reverse Control Actions"
A.7 UFT flow diagram for guideline 6.0/18 "User Confirmation of Destructive Actions"


List of Tables

2.1 Ergonomic dialogue principles from ISO 9241-110 and their corresponding description (see [23])
2.2 Attributes of information presentation as recommended by ISO 9241-12 (see [24])
4.1 Comparative list of active tools for automated GUI testing
4.2 Comparative list of active tools for usability evaluation
5.1 Usability guidelines selected for practical experiments
5.2 Guidelines selection data grouped by functional interaction area
5.3 Feasibility of heuristic evaluation automation grouped by functional area


List of Listings

5.1 Code Snippet with the GUI event sequence verifying login window responsiveness [1]
5.2 Code Snippet with the GUI event sequence checking tabbing navigation in the login window [1]
5.3 Source code with the GUI event sequence verifying data protection from an interrupting action [1]
List of Abbreviations

ACTA     Applied Cognitive Task Analysis
AI       Artificial Intelligence
AMME     Automatic Mental Model Evaluator
ASQ      After Scenario Questionnaire
CAMPUS   Cognitive-Affective Model of Perceived User Satisfaction
CDM      Critical Decision Method
CFM      Cognitive Function Model
CLI      Command Line Interface
CSS      Cascading Style Sheet
EPL      Eclipse Public License
ESD      Electronic System Division
ETIT     External Internal Task Mapping
GOMS     Goals, Operators, Methods, and Selections
GUI      Graphical User Interface
HPE      Hewlett Packard Enterprise
IBAN     International Bank Account Number
IDE      Integrated Development Environment
ISO      International Organization for Standardization
ISTQB    International Software Testing Qualifications Board
IxD      Interaction Design
NNG      Nielsen Norman Group
QUIS     Questionnaire For User Interaction Satisfaction
SUMI     Software Usability Measurement Inventory
SUPR-Q   Standardized User Experience Percentile Rank Questionnaire
SUS      System Usability Scale
TKS      Task-Knowledge Structures
UFT      Unified Functional Testing
UI       User Interface
UME      Usability Magnitude Estimation
UX       User Experience
WIMP     Windows, Icons, Menus, and Pointers
are the inspections find in heuristic can users However, end that requiring method interaction, [12]. not inspection despite problems cheap manually usability usability and performed and minor a user perform to and to to of referred major Contrary easy observation are many an the which [11]. considered through problems user is principles, evaluation identified usability a usability heuristic its are inspects broad identify problems evaluators fairly to where of of order method set set in a small heuristics the where a as method to engineering respect Jakob by usability with defined interface discount evaluation, a heuristic is as approach relevant.Nielsen, evaluation to less usability being limited resource-friendly data generally more collected is its A analytical as and an well as using applications, as applications tools challenging, mobile windows analytical more on and is such behaviour tool web from user Analytics in observing less Google Therefore, benefit than instance, reporting. for programs frequent error as, desktop less such data hand, is sample them other test providing feedback the a by user and On testers lab assist [10]. a can of [9] tools lack reports some a applications, for compensate web may and can that mobile (which of environment case lab the suitable In a re- and results size), sufficient test [8]. a expensive) usability usability quite (of accurate be the sample Acquiring Moreover, user aspects. cooperating interaction. a manual the quire by certain of through dominated elegance GUI is the and process of efficiency evaluation and quality the the users boost enhancing between Generally, that for interaction 3.3). suggestions improvements the by section evaluating followed in and it’s more then observing detailed interfaces, around of effectiveness, is revolves practice (and with usability the application goals is the software evaluation specified testing Usability a achieve of [7]. 
to use” usability users of the context specified assessing specified by us- a extent used is in “the satisfaction application be as and an can efficiency usability of product defines GUI a (ISO) the Standardization which enhance for to greatly Organization can International that attribute The quality ability. one hand, other Ranorex the [4], On appli- applications Android web mobile of for case Library Support in tools, Testing [6]). Selenium testing Android cations as automation the (such GUI and effort many [5], applications of and Windows creation time for the of to lot plat- led a desktop This and testers problem. mobile saving this web, to on changes GUI response increased through manual in automation goes forms GUI software a for frequently program, demand how the complex Considering functionali- development, application’s more [3]. during the updates a testing software when For major especially after issues [2]. ties serious to operations lead WordPad GUI could Microsoft approach possible as testing very 324 such is application than applications simple big more a of even has testing instance, for manual For need a contrast, resource-expensive. the have In and complex, today growing. slow more applications always getting Since is and process larger 3.2). said becoming section automating are by various In- which details User GUIs and in Graphical of goals, covered is proportion is diverse forms significant it with these (and of exist testing One testing (GUI) development. in of especially, terface of application, stages forms different an Many for of suitable quality requirements. techniques, high changing the of ensuring for world necessary a task, a is testing Software Statement Problem 1.1 Introduction 1 Introduction 1. Chapter Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. 
Therefore, automating the inspection process would reduce the evaluation time and the needed resources, which by extension would lower the designing and testing costs and contribute more to shortening the iterative design cycles (which are discussed in section 2.2.2) [13]. Furthermore, the automation would help reduce the risk of usability loss in the manual heuristic-based inspection performed by different evaluators after each software update. Additionally, major usability issues might be detected, or inconsistencies caused by misinterpretation of usability guidelines or by applying them in the wrong context of use [15] [16]. Once enough guidelines are checked automatically, automation would bring more stability to the testing process, the human limitations would be overcome, and the dependence on the presence of usability experts for conducting every single part of the inspection would be greatly reduced [14].

Unlike other usability engineering methods, usability tools on the market currently do not seem to focus on automating the inspection methods, but rather on capturing usability-relevant data from the observed user sample. In the academic field, previous research attempts at further automating the inspection method mainly focused on identifying aesthetic or navigational usability issues [18] [15] [17] [13] [16], more so than on function-oriented problems (as explained in section 4.1.4). For instance, some usability tools can test if the used font is too small for beginners or expert users, tell if a functionality might be undone after its completion, or if two different text fields or buttons are too far apart or not aligned properly [16]. In fact, they cannot verify the accomplishment of any task requiring the sequencing of a series of GUI events. Sequencing GUI events is traditionally considered an area of GUI testing: a lot of academic research [17] and script-based tools [18] support event-driven GUI automation capabilities [15], and those are currently possible for the testing tools on the market, as long as the tool in question supports them [13].

All points considered, it appears reasonable to assume that GUI test automation techniques such as event sequencing may positively impact the way heuristic evaluation is performed. By enclosing the context of use to cover only general Windows applications, it becomes simpler to gather a stable list of usability heuristics, whose feasibility of automation with and without GUI event sequencing would be analysed. Automating usability heuristics, traditionally a manual aspect of the usability engineering field, would provide a notable improvement. The problem, however, is that not all heuristics are suitable for automation, and even if they were, not all of them are relevant for Windows applications. Therefore, it is needed to examine all of them, select a minimalistic set, and re-categorize them into three clear groups: those that require GUI event sequencing, those that can be automated without requiring the sequencing of GUI events, and finally those that should only be checked manually. Afterwards, a GUI test automation tool (suited for a Windows desktop test environment) would then be used to detect some guideline violations in some GUI software applications, in order to affirm the correctness of the analysis and of event sequencing in practice.

1.2 Motivation

In general, automating a testing process greatly reduces the time, cost, and resources needed for testing, as long as the cost of maintaining the automated tests does not exceed the cost of manual tests. Naturally, the same goes for usability evaluation. The benefits of automated tests for GUI testing are discussed in details in section 3.2.3, and the benefits of automating usability evaluation are covered in section 3.3.4. However, one additional benefit is gained while automating usability inspection in particular.
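The sequencing capability discussed above can be pictured as replaying an ordered list of GUI events against the application and verifying a guideline between or after the steps. The sketch below is purely illustrative: the tiny OrderForm model and the rule being checked (a confirmation must precede a destructive delete, in the spirit of guideline 6.0/18 "User Confirmation of Destructive Actions") are hypothetical stand-ins for a real application driven through a tool such as UFT.

```python
# Illustrative event-sequencing check: a minimal GUI model is driven
# by an ordered sequence of events, then a heuristic is verified.
class OrderForm:
    """Hypothetical stand-in for a GUI under test."""
    def __init__(self):
        self.orders = ["A-1"]
        self.pending_delete = None
        self.dialogs = []  # log of dialogs shown to the user

    def click_delete(self, order):
        # a well-behaved GUI asks for confirmation before deleting
        self.pending_delete = order
        self.dialogs.append(f"Confirm deletion of {order}?")

    def confirm(self):
        if self.pending_delete in self.orders:
            self.orders.remove(self.pending_delete)
        self.pending_delete = None

def run_sequence(gui, events):
    # replay events in order, mimicking a scripted GUI test
    for name, *args in events:
        getattr(gui, name)(*args)

form = OrderForm()
run_sequence(form, [("click_delete", "A-1"), ("confirm",)])

# Heuristic check: the destructive event must have raised a dialog,
# and only the confirmed order may be gone afterwards.
assert any("Confirm" in d for d in form.dialogs)
assert form.orders == []
```

A single geometry or font check needs no such ordering; the point of sequencing is that the verified property only becomes observable after a specific chain of interactions.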
The is section The prototyping. which in GUI GUI. design, during iterative completed particular, for in a and applies stages, of same development usability earlier in the assist can assess inspection helping of top On 1.1: Figure tabbed the in this tabs In version). the localised all the if situations. matter checking such (no by visible instance, from are for automated emerging pane this, implementing solve issues the would Therefore, usability automation require case, detect used. would particular help language and would each cost, usability allowing in inspections testing separate by fluent usability the performing experts yet, could However, increases usability simpler it greatly of more. or shown versions detected, presence little window, the localised been a German each Furthermore, has window the on configuration problem of evaluations buttons. the the width small resize If the and two to increasing seen, like users users. be by looks to solved by almost scrolling been resized it horizontal have be requires emerged. that It cannot short problem window pane. so usability configuration tapped is Because a the bar counterparts, green). on scroll visible in English the not circled their is area than respective tab does longer same "path" that the The often version The has German are (which the right). words on version the issue English German on usability the German a on and to exist attention left "Dolphin bringing not the software circle on same red (English a the versions includes shows of figure localised 1.1 major window figure two configuration a instance, under the after For but on usability Emulator", versions. present localised component of the different pane degree has its tabbed software all some a the across ensures lost that usability technique had of no testing degree or there’s application tool same However, an no also loss. if there’s Moreover, functionality checks update. 
GUI event sequencing is a technique successfully used for assisting software testing (and is detailed in section 4.1.2). That is to say that, despite their many differences, the two fields of software testing and usability engineering share more common aspects than it might appear. With GUI testing tools already in existence, it is possible to bring automation practices from software testing to usability engineering, which would lead other usability researching efforts to push automation further.

Figure 1.1: Comparative snapshots of the tabbed pane in the configuration settings window of the Dolphin emulator under the English (left) and German (right) localised versions

1.3 Aim of the work

The aim of this thesis is the automation of inspection-type usability analysis and evaluation methods; more specifically, the research goal is the automation of heuristic evaluation. Such automation lessens the burden on usability evaluators by reducing the sum of needed manual tests. The usability heuristics in question are broad rules of thumb covering, for instance, the visibility of system status, flexibility and efficiency of use, and recovery from errors. These sorts of guidelines may vary in automation difficulty depending on the context of use. To that end, the theoretical part of the thesis examines the automation of heuristics on user interfaces of Windows desktop applications: a complete set of heuristics is gathered and classified into different categories according to their compatibility with GUI event sequencing, in order to argue the feasibility of their automation with and without event sequencing. Afterwards, a tool capable of GUI event sequencing is selected, alongside some software applications under a Windows environment. Some of the previously gathered heuristics are then partially implemented and evaluated, and the automation of said heuristics using event sequencing is tested.

However, all the efforts related to automating heuristic-based inspection are limited to assisting, and not to totally replacing, the human usability evaluator: even an automation tool can make a mistake, or fail to detect an obvious issue due to some irregularity in the application. At the end, the expert's own judgement is solely responsible for the relevance of the results of an automated inspection. Therefore, the purpose of the study is not to find a way of replacing all manual work, but rather to comprehend how to decrease it as much as possible. By documenting an experiment, the practicality of automating part of the heuristic-based inspection with a GUI event sequencing tool is challenged.

1.4 Structure of the work

This thesis is comprised of six chapters (including this one), which are briefly described in the following:

• Chapter 1: The first section is the introduction. It provides some background on the current situation of usability evaluation and the insufficient efforts toward automating heuristic evaluation. Then, the general topic of this thesis is described, including its benefits, and a proposed solution is discussed.

• Chapter 2: The second chapter is called "Basics of Graphical User Interfaces and Usability Engineering", and it covers the essentials needed to comprehend what the basic concepts in Usability and GUI are, as well as how GUIs are designed.

• Chapter 3: The third part is titled "Fundamentals of Software Testing". It provides groundwork for understanding key concepts in software testing in general, and in GUI testing in particular.

• Chapter 4: The fourth section is referred to as "Automating Heuristic-based Usability Inspection with GUI Event Sequencing". This chapter examines in detail the relation between usability evaluation and GUI testing, and reviews some of their respective tools. It is followed by a discussion of the origins of heuristics, how they are deduced, and how they get updated. Finally, the hypothesis questioned in this thesis is presented.

• Chapter 5: The fifth chapter is titled "Proof of Concept for Automating Heuristic-based Usability Inspection with GUI Event Sequencing". This section derives a relevant heuristic set from a sizeable set of guidelines, categorises the heuristics into different classes according to their focus, and groups them according to the feasibility of automating them. Then, their classification is validated by automating a representative sample with the tool best suited for this task, according to the information gathered in the fourth chapter.

• Chapter 6: The final section is the conclusion. It summarises the results of the study and the lessons learned. Also, it proposes suggestions on how to further research the subject of automating usability inspections.
2 Basics of Graphical User Interfaces and Usability Engineering

With increasing software demand and evolving technology, different ways of interacting with computers have been created. It is important to comprehend the basics of this interaction and why it is preferred to design it so that it is easy for users to work with.

2.1 Graphical User Interfaces

Nowadays, machines can not only be used, but can also be interacted and communicated with. This type of communication between human and machine can be broad, and encompasses multiple fields such as computer science, multimedia, design, and behavioural studies. In order to examine this interaction, it is important to get associated with the technical vocabulary and to narrow the view on the subject.

2.1.1 User Interface and Interaction Design

A user interface is the portion of software where effective interaction between human and machine happens: the human is given control over actions from the human side, while the machine side displays simultaneous feedback, assisting him in conducting his activities and thus solving problems for him [20]. A GUI is a class of user interfaces that facilitates human-computer interaction through the use of graphics and visual indicators such as windows, icons, and menus [20]. These interfaces are usually manipulated through a pointing device such as a mouse or stylus, or accessed through a touchscreen. Unlike a Command Line Interface (CLI), which uses only text commands, a GUI improves software learnability by not requiring the user to learn an additional set of commands for each program he is using; it relies on a consistent set of graphical elements.

User Interface (UI) design is the process of visually guiding the user through a GUI via interactive elements. This process focuses on guiding users most efficiently, and aims at improving software effectiveness and efficiency as well as maximizing user satisfaction and ergonomics, in order to make software usable and useful by properly considering user needs and requirements. A UI designer is not only responsible for implementing the correct interaction between the UI and users, but also for prototyping, ensuring interactivity, adding proper responsiveness, and taking into account human factors and different device screen sizes.

From this background, a more broad facet of the development process would be the definition of User Experience (UX) design: the creation and synchronisation of the elements that affect the user's experience with a particular company, with the intention of influencing the user's perspective and behavior [21]. These elements do not stop at the GUI, but also include what the user can physically touch (such as product packaging) or hear (such as commercials). Despite being predominantly defined by digital companies, UX design theoretically incorporates essential non-digital user characteristics as well. UX practice includes UI design, information architecture, Interaction Design (IxD), user research, cognitive science and other disciplines. Interaction Design is a design discipline examining interactive software systems. Even though, from this angle, IxD is based on UI design, its methodology is based in theory, practice and design; IxD is a subset of UX, and its focus is on defining the complex dialogues that happen between users and machines. IxD is contextual by nature: it solves a particular issue under a particular background and situation, using the available resources [22].

Given this context, ISO 9241-110 sets forth a set of specific ergonomic requirements and principles for the design of dialogues between humans and interactive information systems [23]. These dialogue principles, which are listed and described in table 2.1, apply in general terms and do not consider aspects such as the situation of use, application, environment, marketing, aesthetics or corporate design and technology.

The information presented in a GUI should be organized and coded considering the presentation aspects recommended by ISO 9241-12 [24]. The presentation attributes of information are its display location, the way all forms of information are arranged, aligned and grouped, its labels, shape, size and color coding, information organization, and information coding. These attributes, listed and described in table 2.2, contribute to the static aspect of UI design (also called the "look" of the interface), while the dialogue principles from ISO 9241-110, especially the principle called conformity with user expectations, also contribute to good UI design [20].

Additionally, ISO 9241-13 describes how to guide the user in interaction and error management, and provides recommendations on the best use of prompts, on-line help, feedback, and status information, which directly and indirectly contribute to user guidance. The primary principle governing this aspect of UI design is maintaining consistency [25].

Table 2.1: Ergonomic dialogue principles from ISO 9241-110 and their corresponding description (see [23])

Suitability for the task: A dialogue is suitable for a task to the degree that it assists users in completing the task effectively and efficiently.
Suitability for learning: A dialogue is fit for learning to the degree that it supports and guides the user in learning to perform his tasks during the learning phase.
Suitability for individualisation: A dialogue is fit for individualisation to the extent that it allows its own modification in order to best suit a certain task, or the user's needs and skills.
Conformity with user expectations: A dialogue conforms to user expectation to the degree that it matches the user's experience and behaves according to common conventions and the task.
Self descriptiveness: A dialogue is self-describing to the point that every one of its contained steps is completely and immediately comprehensible, through feedback information from the system or in response to user queries.
Controllability: A dialogue is sufficiently controllable when the user can maintain the direction across the whole interaction until the point where he achieves his goal.
Error tolerance: A dialogue is tolerant of errors to the point that, despite entering evident erroneous input, the expected results might be achieved with minimal to no corrective action.
Design consistency implies predictability of system response to user inputs [26]. Every action made by the user should result in a noticeable feedback from the computer.

2.1.2 Components of Graphical User Interfaces

GUI components are visual elements offering consistent information representation. One method of classifying said elements is to focus on the generic conventions they represent: components representing information are called structural elements, while components representing the state of an ongoing interaction are called interaction elements [27]. Structural elements define the entire appearance and look of the interface, and contain windows, menus, icons, controls and tabs [27]:

• Windows: A window is a GUI display area representing some content or information elements on the screen. The content of a window could encompass other windows, or be displayed independently from the rest of the screen. The screen can be divided into different areas whose shape, size, and content change, and each window can run independently from the other windows of a program. Each window area can be moved around, and only a limited number of windows can be opened simultaneously, as the system memory can limit that number.

• Menus: A menu gives the user the choice between a list of selectable executable commands. Menus are convenient and practical because they present a simplified representation of the software's features and capabilities. The offered options are mainly accessed through a pointing device and keyboard shortcuts.

• Icons: An icon is a small graphical representation of a software, files, commands, or windows. Icons stand as shortcuts that take advantage of user familiarity with similar digital objects, improve learnability, and are a quick way to run programs with user guidance.

• Controls (or widgets): These elements allow direct manipulation of a GUI. Each widget can make interacting with specific information easier, enhance user-computer interaction, and improve restructuring a GUI while maintaining consistency throughout the entire information system.

• Tabs: These elements allow multiple panels or documents to be represented within a single window. Tabs facilitate the navigation between sets of documents, and add consistency to the information presentation.

Table 2.2: Attributes of information presentation as recommended by ISO 9241-12 (see [24])

Clarity: The information content is relayed quickly and accurately.
Discriminability: The displayed information can be accurately distinguished by the user.
Conciseness: Users are not overburdened with irrelevant or inapplicable information.
Detectability: The attention of the user is directed towards the information he seeks.
Legibility: The content can be read with ease.
Comprehensibility: The meaning of the presented information is clearly understandable, recognizable, and unambiguous.

Interaction elements can either show the parts of the GUI that the user can interact with, or be a visual reflection of the user's selections and intent. This category mainly includes pointers, cursors, and selections [27]:

• Pointer: This element is a symbol appearing on the display screen that moves in a manner reflecting the movements of the pointing device. The pointer locates a point on the screen, and initiates direct manipulation actions such as a click, a double-click, or a drag-and-drop movement, through the pointing device or through touching.

• Cursor: This is a position indicator used to show the current position on the display screen where the GUI would respond to user input. While the pointer is controlled by a pointing device such as a mouse, the text cursor, on the other hand, is controlled by an input device such as the keyboard. A text cursor is also called a caret.

• Selection: A selection is a list of one or more items on which operations can take place. The selection may be created manually by the user, or automatically by the system. Elements can be selected on a touchscreen, through a pointing device, or through keyboard shortcuts. Selecting more than one element at a time is called multiple selection.

Another way of categorizing components focuses on their function in the GUI. This method adds to the clarity of the classification; in other words, an element might belong to more than one group if it has multiple functions. Key components in a GUI might be divided into one of the following four classes [28]:

• Input control components: These enable the user to perform various functions throughout the interaction. This group contains control elements such as buttons, dropdown lists, list boxes, checkboxes, toggles, radio buttons, text fields, file fields, and date fields.

• Navigational components: These elements define the possible ways of navigation in a GUI. This group contains elements such as search fields, sliders, pagination, and tags.

• Informational components: These elements provide feedback or useful information to the user, and assist him in the cognitive aspect of the interaction. This group contains components such as icons, progress bars, and message boxes.

• Container components: A container component gathers other elements and presents them in an organized fashion. An instance of a container component is the accordion: a vertically stacked list of items that, when an item is clicked on, either shows or hides the section of the GUI under it. An accordion element expands to show the content within it, or hides more functionalities. Depending on the configuration, no more than one element can be showing its content at a time.

Additionally, key dialogue components in a GUI might be divided by ISO into four groups, alongside their respective recommendations. First, menu dialogue components and recommendations on their best usage practices are described in ISO 9241-14 [29]. Subsequently, command dialogue components are described in ISO 9241-15 [30]. Direct manipulation dialogue components are the third group, presented in ISO 9241-16 [31]. Finally, form filling dialogue components are the fourth group, and their recommendations are described in ISO 9241-17 [32].
2.2 Usability Engineering

Many guidelines exist for creating the interfaces that GUI designers and GUI developers themselves might like the most, but the freedom that these guidelines allow does not mean that every interface will suit its users. Properly balancing these guidelines might make a big difference in the success of the interfaces, and in how users learn to achieve the optimal experiences with a GUI. Therefore, in order to design for users, it is important to understand what usability means and the basic concepts in usability engineering.

2.2.1 Definition of Usability

Usability is defined in ISO 9241-11 [7] as "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use". Examining the definition, effectiveness answers the question of whether the system enables users to perform their work as anticipated during development; efficiency measures the required effort for users to complete their intended tasks, or the product's ease of use; and lastly, satisfaction refers to assessing the system from the perspective of its intended users. An improvement in the functionalities and features of an application does not systematically signify an increase in its usability. Finally, the term usability is used to describe systems with a high user-friendly degree. Usability is, according to Jakob Nielsen, a quality attribute for assessing UI ease-of-use, and also refers to methods for enhancing ease-of-use during the design process, with ease-of-use traditionally associated with multiple key components [8]:

• Learnability: It should be easy for users to rapidly learn basic system functionalities and to perform tasks the first time they encounter the UI.

• Efficiency of use: The software has to be efficient, allowing the user to be productive after learning to use the system.

• Memorability: Remembering how to use the system should be easy, so that a user who became inactive for a while can do the tasks again without relearning them.

• Scarce errors: The error rate of the software should be low, so that fewer non-severe errors occur and no severe or catastrophic errors are encountered, and when errors do occur, users should be able to recover from them without wasting time.

• Subjective satisfaction: Using the software should be a pleasant experience, to the degree that users subjectively like it (or at least don't get frustrated from using it).

2.2.2 Definition of Usability Engineering

Usability engineering is the field in which structured methods for designing efficient and elegant UIs are studied and analysed in order to produce user friendliness. In other words, usability engineering directly engages in assessing usability and suggesting ways to improve it, ideally throughout the design process, rather than idealising interfaces. This practice includes a set of usability activities and is not a one-time event: it was recognized early on in this field that usability engineering should not happen as a supplementary practice, but rather in multiple stages of the development process [33]. This is to ensure that the resulting system offers the possibility to be user-friendly. Even if not all the steps in the lifecycle are performed, usability engineering efforts can still wield successful results [8]. These stages are represented in figure 2.1, and are briefly described below [34]:

• Analysis: The analysis stage may include user analysis, task analysis, functional analysis, and competitive analysis. It should always result in concrete goal setting.

• Design: One approach for doing this is parallel design, which aims at exploring different alternative designs before choosing the best one and developing it further iteratively. Another approach would be participatory design, which involves the users in the design process to avoid a potential mismatch between the actual user tasks and the tasks represented in the design model used by the developers. No matter the approach, it is important to focus on consistency when designing.

• Prototyping: A prototype is an incomplete version of some considered software solutions. Creating prototypes early on saves up the time, cost and effort of implementing actual designs, while allowing evaluation instead. Decreasing the feature quantity is called vertical prototyping, and decreasing the functionality depth is called horizontal prototyping. Finally, reducing both the quantity and the depth to match a certain use case scenario is called scenario-based prototyping.

• Expert evaluation: After building a prototype, it is used to explore and evaluate the usability aspects of the system. The evaluation is done by one or more experts, who examine how real users would interact with the designed interface.

• Empirical testing: In this phase, usability is examined and tested on real users, by conducting experiments or carrying out one or any combination of usability evaluation methods. This stage aims to identify and analyse usability issues, their severities, and their causes.

• Iterative design: Based on the identified usability problems, the interface is thoughtfully redesigned. That new design is then analysed by checking its conformity with usability heuristics, and discussed with usability experts. If a major issue persists, the iterations continue until the problem is deemed as resolved. This iterative process can benefit from additional empirical usability tests relying on real users.

• Feedback: After the system is released, secondary data about the system's usability can be gathered. This data can come in the form of client complaints, a high number of users exposed to a usability problem, special logging files, or more market studies. The feedback gathered can form the basis for new analysis (and thus start the cycle anew).

Figure 2.1: The stages of the usability engineering lifecycle (see [34])

2.2.3 Usability Metrics

A usability metric is a way of measuring or evaluating usability in a consistent and reliable manner. By using the same method of measurement, it is possible to keep track of usability improvement between releases, and to compare a product's competitive position to others. Companies can decide whether a system is usable enough for release. Additionally, consistently measuring usability may also assist in making higher-level company plans, such as keeping track of support cost (a user interface design that is not user-friendly has a higher customer-support cost) or giving bonuses to lead-developers based on usability during the year. Usability metrics can also be useful for proving that the actual usability value of a design is low or high compared to another. The methods for assessing and measuring usability might differ depending on the evaluator. One way of defining usability is by focusing on certain key factors, which are effectiveness, efficiency and user satisfaction.

Effectiveness: This is one of the fundamental usability metrics, and it is simple and relatively easy to measure. It can be measured by calculating the success rate (also called completion rate), represented by a percentage, using the following equation [35] [36]:

Success Rate = 100 × (Σ_{j=1..U} Σ_{i=1..N} n_ij) / (N × U)

where:
n_ij = The outcome of task "i" by user "j": if he finishes the task successfully, then n_ij = 1; if he does not, then n_ij = 0
N = Total number of tasks
U = Total number of users

Effectiveness can also be measured by calculating the error rate. This can be achieved by observing users participating in usability testing experiments and counting the number of mistakes made by each user. The evaluator would usually assign a severity rating to each encountered error and classify it under a corresponding category. The average number of errors per task provides important diagnostic data about the system.

Efficiency: This attribute takes more time and effort to measure than effectiveness. The evaluator would observe users performing tasks and note how much time was needed for each activity. Time-based efficiency is one way of measurement, calculated by the equation shown below [35] [36]:

Time-based Efficiency = (Σ_{j=1..U} Σ_{i=1..N} (n_ij / t_ij)) / (N × U)

where:
t_ij = The time needed to finish task "i" by user "j"
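As a worked example, the two metrics above can be computed directly from an outcome matrix and a timing matrix. The function and variable names below are illustrative and not from the thesis: `outcomes[i][j]` plays the role of n_ij and `times[i][j]` the role of t_ij.

```python
# Minimal sketch of the success rate and time-based efficiency metrics.
# outcomes[i][j] = n_ij (1 if user j finished task i, else 0)
# times[i][j]    = t_ij (seconds user j spent on task i)

def success_rate(outcomes):
    """Percentage of successfully completed task attempts."""
    n_tasks = len(outcomes)
    n_users = len(outcomes[0])
    total = sum(sum(row) for row in outcomes)
    return 100.0 * total / (n_tasks * n_users)

def time_based_efficiency(outcomes, times):
    """Average of n_ij / t_ij over all attempts, in goals per second."""
    n_tasks = len(outcomes)
    n_users = len(outcomes[0])
    ratio_sum = sum(outcomes[i][j] / times[i][j]
                    for i in range(n_tasks) for j in range(n_users))
    return ratio_sum / (n_tasks * n_users)

# Two tasks, two users; user 2 failed task 1.
outcomes = [[1, 0], [1, 1]]
times = [[30.0, 60.0], [45.0, 40.0]]
print(success_rate(outcomes))                    # 75.0
print(time_based_efficiency(outcomes, times))    # goals per second
```

With three of the four attempts successful, the success rate is 75%; the time-based efficiency averages in a zero term for the failed attempt, so slow or failed attempts both pull the value down.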
If the user gives up on completing the task, then the moment the user quits is considered for calculating the total time. Time-based efficiency has a unit: it is a value of goals per unit of time (per second). Another way of assessing this attribute is through calculating the percentage that the time required to successfully perform a task represents in relation to the total time taken for all the tasks and by all users. This gives the overall relative efficiency. The variables used in the equation for overall relative efficiency match the same concepts as their counterparts in the equation for time-based efficiency [35] [36]:

Overall Relative Efficiency = 100 × (Σ_{j=1..U} Σ_{i=1..N} n_ij × t_ij) / (Σ_{j=1..U} Σ_{i=1..N} t_ij)
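Continuing the same hypothetical data layout as before (invented names, `outcomes[i][j]` for n_ij and `times[i][j]` for t_ij), the overall relative efficiency can be sketched as:

```python
# Minimal sketch of the overall relative efficiency metric: the share of
# the total time that was spent on successful task attempts, in percent.

def overall_relative_efficiency(outcomes, times):
    """100 * sum(n_ij * t_ij) / sum(t_ij) over all tasks i and users j."""
    productive = sum(n * t for o_row, t_row in zip(outcomes, times)
                     for n, t in zip(o_row, t_row))
    total = sum(t for row in times for t in row)
    return 100.0 * productive / total

outcomes = [[1, 0], [1, 1]]            # n_ij: 1 = success, 0 = failure
times = [[30.0, 60.0], [45.0, 40.0]]   # t_ij in seconds
print(overall_relative_efficiency(outcomes, times))
```

For this toy data, 115 of the 175 observed seconds were spent on successful attempts, so the metric evaluates to roughly 65.7%.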
Satisfaction: One way of measuring user satisfaction is to have users fill in a questionnaire immediately after they finish their attempt at completing a task. These questionnaires, also called post-task questionnaires, usually consist of less than five questions, asking the user for an assessment of the task's difficulty and of the satisfaction level. Instances of this type of questionnaires are the After Scenario Questionnaire (ASQ) with three questions [36] [37], and the Usability Magnitude Estimation (UME) with just one question [38]. Another method of measurement occurs after a test session concludes, and aims at assessing the participant's overall impression of the system's usability. Examples of questionnaires in this category include the Standardized User Experience Percentile Rank Questionnaire (SUPR-Q) with thirteen questions, the Software Usability Measurement Inventory (SUMI) with fifty questions, the Questionnaire for User Interaction Satisfaction (QUIS) with twenty-four questions, and the System Usability Scale (SUS) with ten questions. Choosing the appropriate questionnaire depends mainly on the importance of the user's perspective on satisfaction for the company's budget and the project.

The main downside of relying on usability metrics is the cost: according to the Nielsen Norman Group (NNG), performing a quantitative usability study might cost up to four times as much as a qualitative study [39], while time and resources are typically scarce. Qualitative evaluation methods provide useful insight for improving a design, while quantitative metrics provide the numbers that assess a design and keep track of improvement progress. (These usability evaluation methods are covered in detail in section 3.3.3.)

2.2.4 Usability Criteria

Usability criteria (or principles) refer to a set of high level guidelines (also called rules of thumb or heuristics) that can be applied to any type of user interfaces, describing how a system can achieve usability. Jakob Nielsen presented ten broad heuristics in his book Usability Engineering [8]:

1. Visibility of system status: The system should always provide users with the information concerning what is happening on the system side and the state of their interaction. This information should be relevant to the context of use, and not sent in its entirety, because that reduces the visibility of the useful part. Additionally, any needed feedback should be sent within a reasonable time.
2.2.4 Usability Criteria

Usability criteria (or principles) refer to the set of standards describing how a system can achieve a high level of usability. Jakob Nielsen presented in his book Usability Engineering ten broad guidelines (also called rules of thumb or heuristics) that can be applied to any type of user interfaces [8]:

1. Visibility of system status: The system should always provide users with the information concerning what is happening on the system side and the state of their interaction. This information should be relevant to the context of use; not all information is needed in its entirety, because the visibility of any part reduces that of the useful part. Additionally, the feedback should be sent within a reasonable time.

2. Match between system and the real world: The system should communicate information with the user by familiar means. System-oriented words, phrases, symbols, concepts, or even conventions that are unfamiliar to the user would cause confusion and should therefore be avoided, while real-world concepts make the information appear natural and logical. For example, if the average end-user does not know the meaning of the 404 status code or the 500 status code, he should be spared the confusion of such error codes; a developer, however, might appreciate knowing the status code.

3. User control and freedom: Users who make mistakes should, whenever possible, have the option to easily undo their activity, as well as the option to redo it. Users should also always be able to terminate an operation, if possible.

4. Consistency with standards: Conventions should be consistently followed, to prevent users from making mistakes, getting frustrated, or not comprehending whether various words or actions have the same meaning. For instance, users are accustomed to seeing animated flashing text as an advertisement, or treating underlined blue text as a hyperlink. Breaking such conventions, by building an animated flashing button that works like a hyperlink, would only confuse users. Another common practice is displaying an asterisk symbol near the required text fields when a user is filling a form; ignoring the meaning of the asterisk symbol would drop the usability value of an interface.

5. Error prevention: This can be done by eliminating error-prone conditions or checking their validity, and letting the user confirm before he's committing a certain action. For instance, using a date picker will assist users in knowing what date convention to choose.

6. Recognition rather than recall: Memorizing prior shown information should not be expected of users while they are working on a task, or when performing tasks between progressively long time intervals. Minimizing what the user has to remember can be achieved by making actions, objects, and options for the use of the system visible. Additionally, the user should easily find instructions for the use of the system whenever appropriate.

7. Flexibility and efficiency of use: An instance of this would be shortcuts (or accelerators). These elements may assist the expert user in performing frequent tasks faster or easier, while shielding unneeded complexity from novice users. This way, the system can attend to the needs of both beginner and expert users.

8. Aesthetic and minimalist design: The screen layout and colors should be simple and visually appealing. Also, information of low or no relevance to users should not be displayed on the dialogues.

9. Assist users in recognizing, diagnosing, and recovering from errors: The system should express the error messages in plain language, precisely describe the problem, and suggest a constructive solution.

10. Help and documentation: User documentation (or user guide) and help for the system should be context-sensitive and user-oriented. The most common tasks must be explained in concrete steps, and the information in them should be easy to search for. The help should also be complete, but not too long.
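As a small illustration of heuristic 9, the sketch below maps raw error codes to messages that state the problem in plain language and suggest a constructive solution. The error codes, wordings, and helper name are invented for the example, not taken from any real system:

```python
# Hypothetical helper illustrating heuristic 9: report errors in plain
# language, describe the problem precisely, and suggest a solution,
# instead of exposing raw codes to the end-user.
RAW_TO_FRIENDLY = {
    "ECONNREFUSED": ("Could not reach the server.",
                     "Check your internet connection and try again."),
    "EACCES": ("You do not have permission to save this file.",
               "Choose a different folder or ask an administrator."),
}

def friendly_error(raw_code: str) -> str:
    problem, suggestion = RAW_TO_FRIENDLY.get(
        raw_code, ("Something went wrong.", "Please try again later."))
    return f"{problem} {suggestion}"

print(friendly_error("EACCES"))
```

A developer-facing log can still record the raw code; only the dialogue shown to the user is translated.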
Since these criteria are presented with enough abstraction, they can be applied to all types of UIs. Additionally, each one of them can be composed of an additional group of guidelines, which can differ depending on the context of use. For instance, the principle of consistency with standards applies to desktop, mobile and web applications, which are grouped under the same GUI standards. However, the standards for desktop, mobile and web applications are different. Therefore, the proposed subset of guidelines for the same principle would differ depending on the nature of the GUI.

3 Fundamentals of Software Testing

There are different phases and different models in software development. However, no matter the software development model used or the size of the team involved, software testing is always needed, with a varying impact on the final product. Therefore, it's important to comprehend the fundamentals of software testing, its techniques, its types, and their significance to the users.

3.1 Definitions and Basics

In software development, testing is an integral section of examining the completeness, correctness, and quality of software implementations. It can also be stated as a subset of the software quality assurance activities, including verification and validation, related to evaluating whether a software program is behaving correctly with respect to some specified requirements [40]. Verification evaluates whether an end-product or service complies with a specified requirement, regulation, or imposed condition. It is an internal process applied on system components, and it evaluates whether the end-product behaves as expected, rather than whether it meets the requirements specified by the stakeholders [40]. Validation, in contrast, is performed mostly in order to demonstrate that the built product fulfills its intended purpose when placed in the intended environment, and meets the requirements specified by the stakeholders [40]. One of the main focuses of software testing includes running the program with selected input, and analysing whether the output is correct. What is tested is the correctness of the program's behaviour with respect to a given specification; consequently, an understanding of what makes a program incorrect is also required. To that end, it is needed to define and explain terms such as errors, faults and failures.

3.1.1 Errors, Faults and Failures

When developing software, a human mistake might introduce an error. The state of the computation is deemed incorrect if the computed result is different from the true, specified, or theoretically correct result, or if the observed state differs from the specified condition [41]. Many types of errors exist [42]: a dynamic error (also called run-time error) depends on time-based variations of the applied input, while a static error (also called compilation error) is a mistake whose occurrence is independent of the input. A fatal error results in the total inability of a function to process its input. An inherited error, on the other hand, is one that results from an error that happened in a previous step of a sequential process. A semantic error results from a misunderstanding of the meaning behind a word, a symbol, or a group of symbols or words, or of the relationship between the components of a system; such an error can propagate forward from one component to the next.

Errors can occur in any stage of development, and might affect the models, the documentation, or even the source code. The effort put into checking for errors might also depend on the programming languages used: in object oriented languages, for instance, developers might introduce errors when referencing an uninitialized variable, assigning a wrong datatype to a variable, or declaring data and giving it a problematic length [43]. Errors also happen when computing or comparing inconsistent datatypes. There are also control flow errors, such as an infinite loop, or a loop with too many or too few iterations. Furthermore, developers can make interface errors, caused for instance by a mismatch between the number of output arguments sent by a module and the number of input parameters received by other modules. Moreover, errors can occur from not closing files after use, from opening files before they are produced, or from not having sufficient available memory [43].

Subsequently, a fault is the manifested result of an error within the software. It's possible for a certain error to cause multiple faults, or for several errors to lead to one identical fault [44]. Faults can be of two types: faults of commission, which contain an incorrect representation of information, and faults of omission, in which part of the needed representation is missing. An example of a fault of commission would be source code that produces an incorrect value, while an instance of a fault of omission would be an object representing the states and containing the values of a transaction that is partially or completely not persisted. Faults of omission are more challenging to detect and require more effort.

Failures are mainly triggered by executing the code corresponding to a fault. When executing a software, the detection of a failure might occur; in other words, a failure is an event representing the occurrence of a fault. Generally speaking, a failure is the inability of a component or system to perform a required function and deliver the expected result, in the manner specified by the requirements [41]. There are different kinds of failures [42]: a hard failure leads to the complete shut-down of a system, while a soft failure allows the system to continue operating with partial capabilities. A random failure occurs in an unpredictable manner, while a systematic failure has a deterministic cause. Finally, a catastrophic failure is one that happens in the context of safety critical software. A failure can also be referred to by the term defect.

For instance, the requirements of an E-commerce website might, in general, be precise. One of the requirements is that a user who logs in on the shopping site on his birthday should receive a digital gift coupon. Further, this monetary incentive should only be offered once a year, even if the user edits the date of his birthday. However, when developing the function giving the incentive, the developer forgets to properly check some conditions. The developer writes the function checking the user's eligibility for receiving the gift coupon, but does not spend more time checking whether this incentive was already offered to the user in the same year. This mistake made in the source code is an error, and the insufficiently written code responsible for the incentive is the resulting fault, one of omission. The failure then occurs at runtime, where a user who is able to login again on the same day repeatedly performs the login actions multiple times, receiving the gift coupon more than once. The system's behaviour then clearly deviates from the specified requirements, and is therefore by definition a failure.
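The coupon scenario can be sketched in code. The function names and the once-per-year bookkeeping below are hypothetical reconstructions of the example, showing how the developer's error becomes a fault of omission, and how repeated same-day logins turn it into an observable failure:

```python
from datetime import date

granted_years = {}  # user_id -> year in which the coupon was last granted

def grant_birthday_coupon_faulty(user_id, birthday, today):
    # FAULT OF OMISSION (hypothetical): only the date is checked, not
    # whether a coupon was already granted this year, so every login on
    # the birthday yields another coupon -- the failure at runtime.
    return (today.month, today.day) == (birthday.month, birthday.day)

def grant_birthday_coupon_fixed(user_id, birthday, today):
    if (today.month, today.day) != (birthday.month, birthday.day):
        return False
    if granted_years.get(user_id) == today.year:
        return False  # already granted this year
    granted_years[user_id] = today.year
    return True

bday, day = date(1990, 5, 1), date(2019, 5, 1)
print(grant_birthday_coupon_faulty(1, bday, day),
      grant_birthday_coupon_faulty(1, bday, day))   # True True  <- failure
print(grant_birthday_coupon_fixed(1, bday, day),
      grant_birthday_coupon_fixed(1, bday, day))    # True False
```

The missing check is invisible in the faulty code itself; only executing the repeated-login path exposes the failure, which is exactly why faults of omission are harder to detect.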
3.1.2 Testing Process

A process is a series of activities carried out to fulfill a certain purpose and produce a tangible output. All processes accept an input and have a purpose. A detailed description of a testing process might include the metrics enforced, the templates used, the entry and exit criteria, the role of each activity, and the verification points [45]. The testing process presented in the syllabus of the International Software Testing Qualifications Board (ISTQB) could have as input a project test strategy and a project plan, which could also formally include a master test plan, basically an outline describing the general approach and incorporating test management, planning and control [45]. The testing process finally documents some status information on the testing progress. On the other hand, the output of this process might consist of test suites in the form of a test specification, which includes a specification of the test cases, and a test plan detailing the actual workflow, the needed hardware and software setup, and the test output data. The output could also include test environment reports, test execution logs, test summary reports and test progress reports [45]. The activities in the testing process are performed in the following order: test planning and control, test analysis and design, test implementation and execution, evaluating exit criteria and reporting, and finally test closure activities.

• Test planning and control: Test planning includes tasks such as the definition of the mission and objectives of testing, as well as the resource planning. The main activity in general test planning is determining the test strategy and documenting it. Moreover, due to the impracticable nature of exhaustive testing, prioritization of the tests becomes necessary, and is determined based on the assessed risks and the severity of the effect of possible failures [40]. Test control refers to checking the project's practical progress in testing against what has been planned, verifying that the testing activities conform to the test plan, reporting the project's actual status, and taking corrective decisions based on those reports. All the plans made are then to be regularly verified and adjusted throughout the development of the project.

• Test analysis and design: The goal of this task is to design high-level (logical) test cases and to specify the test conditions, the test output values, and the test environment, as well as the preconditions and postconditions needed. A test case is composed of input values, execution preconditions, expected results and execution postconditions, and it has as purpose the verifying of a test condition [46]. This task starts by analysing the project's basis documentation, and it is then followed by designing test cases for both expected and unexpected inputs [40].

• Test implementation and execution: The goal of this phase is to transform the logical test cases into concrete and physical test cases, by reviewing and examining them and executing them in a test environment. This task also aims at running and verifying the completeness and correctness of these test cases. Additionally, the test results should be recorded, because there's no value to implementing and executing tests with no logs of the test results. This should also be taken into account by grouping tests inside test suites to increase execution efficiency, improve understandability, and simplify reproducibility [40].

• Evaluating exit criteria and reporting: Test evaluation requires comparing the actual test results with the group of exit criteria defined in the test plan. Testing is continued until the exit criteria are met: if the defined exit criteria have been achieved, then testing is terminated; if not, either more test cases are executed, or the exit criteria are deemed too difficult and possibly adjusted. The defined measurement criteria enable accurate tracking of the test progress, to be then presented to the stakeholders [40].

• Test closure activities: The knowledge and experiences gained throughout testing are analysed, recorded and archived for guidance in future projects. This also makes the reporting of accurate and useful test data possible. Despite its usefulness, this task is quite often skipped in practice [40].

3.1.3 Testing Techniques

The standard ISO/IEC/IEEE 29119-4 covers a variety of test design techniques to be relied upon throughout the testing process [47]. The techniques themselves can be changed depending on the specific project needs. These techniques are grouped into different categories, which are defined in this standard: specification-based techniques, structure-based techniques, and experience-based techniques.

Specification-Based Testing

The specification-based test design techniques are also called black-box techniques, and can help design functional tests and produce the test cases according to the specified requirements of the product. They do not examine the inner set-up of the software. Since choosing the input data by exhausting all the different potential input combinations is too expensive (or even impossible if the set of possible input is infinite), it is more appropriate to use techniques that simplify the testing process in a reasonable manner [47]. Some of these testing techniques are the following [40]:

• Equivalence Partitioning: This technique divides the input data into different equivalence classes. An equivalence class is a subset of the data that is treated the same way by the object under testing. Therefore, it's enough to test one member: by assumption, if its value is valid, then all the other members of that class are valid as well, and if it detects a defect, then all the other members of the class would have detected the same problem. Before determining that another value chosen from outside the class really detects a defect, it is also important to check all valid and invalid conditions and requirements, and to test valid and invalid ranges [48].

• Boundary Value Analysis: This technique adds significantly to equivalence partitioning. Since defects are often detected when checking the test cases produced by the borders of an equivalence class, it is reasonable to choose the values on or near the exact boundaries of the class. If the datatype makes choosing the exact boundary not possible, then it is sufficient to choose one valid value from inside the equivalence class near the boundary, and an invalid value from outside the said class [48].

• State Transition Testing: Since the behaviour of a system might differ depending on its current state and its history of events and inputs, it is important to design tests covering the system in every significant state, with the specific inputs triggered in every possible state. Covering valid and invalid sequences of state transitions is important as well.

• Scenario testing: A scenario, in this context, is a hypothetical story about how the application or system is used; by extension, a use case or a scenario test is a test that relies on hypothetical but realistic test cases. These tests are usually more complex than typical test cases, and their completions require passing through multiple steps [49].
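The first two techniques above can be sketched concretely. The example assumes a hypothetical input rule, an age field that accepts values from 18 to 65; the variable names are illustrative:

```python
# Equivalence partitioning and boundary value analysis for a
# hypothetical validation rule: an age field accepting 18..65.
def age_is_valid(age: int) -> bool:
    return 18 <= age <= 65

# Equivalence partitioning: one representative per class.
class_representatives = {"below range": 5, "in range": 40, "above range": 90}
# Boundary value analysis: the exact borders plus their invalid neighbours.
boundary_values = [17, 18, 65, 66]

for name, value in class_representatives.items():
    print(name, value, age_is_valid(value))
for value in boundary_values:
    print("boundary", value, age_is_valid(value))
```

Three representatives cover the three classes, while the four boundary values catch the off-by-one mistakes (for instance, writing `<` instead of `<=`) that representatives from the middle of a class would miss.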
Structure-Based Testing

Structure-based testing is also named white-box testing. It bases itself on the source code; thus, it is necessary for the existing code to be accessible. These techniques require that all the test cases be based on the structure of the object's code, and that all the code be executed at least once. However, the tests should not detect the defects based on the expected output of the system, but rather assist in achieving high test coverage: the parts of the code whose execution has been tested are coined as "covered" [47]. In order to cover the specified requirements, the following methods are used [40]:

• Statement testing: This technique has the test cases execute, at least once, either every statement in the code, or at least a predefined percentage of all statements. Before testing starts, a control flow graph might be created to make the process easier. A downside to this technique is that even if all statements in the source code are covered, multiple faults might not be detected. This is due to statement testing only needing all the statements of the source code to be executed: for instance, when an if-block is not entered, the execution flow skips some lines of source code, but there can be cases where faults are found when the if-block is entered and no fault is detected when it is not.

• Branch and decision testing: On the basis of a control flow graph, each time a decision has to be made, the execution flow can go in two different branches. Every node of the graph stands for a decision, while each edge represents a branch. Consequently, achieving a high branch coverage is the result of completing the execution of all decisions in the graph; in other words, a high decision coverage implies a high statement coverage. Instead of testing all statements, this method concentrates on the decisions themselves.

• Condition testing: Since branch and decision testing concentrates on the decisions themselves, it does not check the complexity of the partial conditions that potentially exist inside a general decision (and usually only assigns a "true" or "false" value to the decision as a whole). Instead, it becomes important to evaluate those different internal boolean conditions, separated by logical operators. One way of doing this would be assigning values to every part of the combination of conditions, and checking how they would affect the outcome of the decision.

• Data flow testing: Some defects can only be produced by having the flow of code execution follow a specific path. In other words, such a defect can be detected by checking the specific path in the code where a variable is defined and used, or by making a series of decisions in order to follow a specific execution path. Every time the data flow is used, a new path emerges. A very challenging aspect of this technique would be covering the exponentially high number of possible paths.

Experience-Based Testing

Experience-based techniques rely on the personal skills and gathered knowledge of the team in order to uncover defects that would otherwise fall under the radar of other formal techniques, such as when a fault is only triggered by repeating the same activity a certain number of times. These methods complement the specification-based and structure-based test design methods well, and do not have a strong dependency on the requirements specification. Some techniques of experience-based testing are the following [50]:

• Error guessing: This is a test technique that makes the most out of the experience of a skilled tester. In other words, the tester is supposed to be creative, and check for situations that most likely lead to defect detection. Examples of such test cases may typically include division by zero, checking the system reaction after inserting a blank input, or observing system behavior after inserting some problematic symbol characters.

• Exploratory testing: This technique is only appropriate for experienced testers, as effectively exploring the system requires a good set of skills, as well as knowledge of the system harnessed for predicting defects. This method involves less test planning and more test execution than the other testing techniques.

• Checklist-based testing: A checklist can present possible faults that usually go undetected by regular testing. These can take the form of rules of thumb, or can be gathered from experiences in previous projects. They can form the basis for a quick start in experience-based testing, and assist the lesser-experienced testers in keeping track of their test progress and pace.
3.1.4 Testing Methods

Software can be inspected using different testing methods, such as structure-based testing, also called white-box testing, which has already been described in section 3.1.3. Another method is specification-based testing, also called black-box testing, which has also been discussed in section 3.1.3. These two methods each have their own strengths and weaknesses, and these points need to be examined. Consequently, it's also possible to attempt to somehow combine the two methods when needed, with gray-box testing, whose details also need to be discussed.

White-Box Versus Black-Box Testing

In order to shed additional insight on white-box and black-box testing, the advantages and disadvantages of each must be compared and detailed. Among the strengths of the former are the following [51]:

• Efficiency in exhaustive defect identification: By having access to the code, for instance, it would be easier to choose the input data that exposes defects, and the correctness and completeness of the expected output can be checked. White-box tests can identify hidden defects by investigating the structure of the system, which would otherwise not be exposed.

• Code optimization and the improvement of the overall code quality: By detecting and fixing the made mistakes, the code is constantly updated and improved, and developers are more conscious of the importance of detecting defects early in development. As endorsed in good practices, more code quality can help prevent, fix, or limit defects. Moreover, white-box tests can also unravel dead code (which is code that is never executed or never used).

• Usefulness for achieving maximum test coverage: The testing effort should be measurable, the progress should be monitored, and the results should be presented to the stakeholders in an understandable manner when needed. Having access to the source code is the only way for efficiently testing and tracking the execution of all its parts.

White-box testing also has its limitations [51]:

• Inability to detect missing functionalities: If some requirements were not implemented, this type of tests will not reveal the problem. White-box tests focus entirely on the written code; defects originating from the absence of a feature, or from some missing part of the structure of a functionality, are not identified.

• Lack of focus on the runtime environment: Dependencies on third party tools, external libraries, or issues related to the running environment may lead the system to partially or even completely deviate from a product's intended functionality. However, white-box tests do not focus on such problems. Vulnerabilities that might be exploited to misuse the product might also not be detected.

• Expensive in time and human resources: Performing good white-box tests needs a great deal of time and effort. The complexity brought by these tests obligates the tester to be familiar with the system being tested, and to have great skills. This translates itself into higher costs.

The second method to examine is black-box testing, and it holds different strong points from its counterpart [51]:

• Efficiency for rapid testing of programs: Testing becomes more expensive the bigger the amount of source code of a program gets. A bigger program offers more functionalities, which does not mean that testing such a system becomes easier. Therefore, focusing on the functionalities of the system, and not on a substantial amount of source code, is more efficient.

• Simplifying the process of understanding legacy code [52]: As legacy code is often outdated or lacking extensions, simply reading the code (or its documentation) does not necessarily provide developers with the needed understanding of its functionalities. However, performing black-box tests provides a faster and simpler way of learning about a system's behaviour and how to use it, without requiring the reading of more lines of code.

• Usefulness of testing from the user's point of view: A separation between the perspectives of the developer and the user is required, in order to show that a system truly meets the customer requirements.

After listing the advantages of black-box testing, it's imperative to look at its weaknesses and limitations. Some of these points are discussed below [51]:

• Inability to detect unneeded extra functionalities: Providing functionalities beyond the product's specifications does not necessarily signify exceeding customer expectations. On the contrary, these extra features that are not demanded by the requirements are not covered by black-box tests. These untested supplementary features often lead to defects that undermine the security of the whole application [40].

• Strong reliance on the requirements specification: If the system specification is not clear, it becomes very challenging to design effective test cases. It's also hard to decide on appropriate input values without clearly understanding the functional specifications.

• Partial coverage: Since black-box tests focus entirely on the functionalities of the system without examining its internal structure, the test cases do not cover all execution paths, leaving some of those paths untested. In addition to having different test data cases, some other execution paths might be repeatedly checked.

Based on the presented strengths and weaknesses of each method, it can be summarized that the first examines the system in a deeper level of detail, but that white-box testing is generally more expensive and time consuming than black-box testing, both in the needed time and in the required skill set.

Gray-Box Testing

Gray-box testing is a method where the tester has partial knowledge about the inner workings of the system, compared to white-box and black-box testing, where the inner workings of the system are either fully known or unknown at all [53]. Gray-box testers have access to more information than needed for black-box tests, and less than the fully known details provided for the white-box testers. However, in all cases, the gray-box tests go beyond the list of specified requirements: test cases can be generated from data such as an architecture diagram [51], or the list of protocols used. Among the advantages of gray-box testing are the following [53]:

• Non-intrusiveness: Since gray-box tests can be based on interface definitions, there's no need for the tester to intrude on the source code.
On the other hand, the gray-box testing method has its own inconveniences as well. These might include the following points, shortly described below [53]:

• Partial code coverage: Due to the limited knowledge about the inner program structure, many code execution paths would not be run by the tests. The achieved coverage depends on the amount of additional information provided to the tester.
• Difficulty in defect identification: When relying on gray-box tests, defect identification might become hard, because defects (which are explained in section 3.1.1) might depend on the environment, especially on the system's capacity to throw exceptions, as well as on the manner they propagate across the testing environment in a distributed system. This means that the knowledge provided to the tester might be insufficient if the source code has a poor quality, or if the system is unstable.

In conclusion, gray-box testing is some sort of middle ground between white-box and black-box testing, and is especially useful for integration testing (described in section 3.1.6). Choosing the appropriate method depends on factors such as the time, the cost, the goals, and the degree of access to the internal program structure [53].

Static Versus Dynamic Testing

Static testing revolves around testing the software manually, or with the help of tools, but without executing any part of the actual code. The main methods by which this is performed are code reviewing and static analysis. A code review refers to the manual examination of the source code. There are many types of reviews, such as inspections, peer-reviews, and walkthroughs. Every type has a different level of formality, but all of them follow a similar procedure: First, a plan is devised in order to uncover program errors. Then, a team reads or visually inspects the source code, with every member operating separately. Finally, the members share and discuss the errors they discovered. Static analysis, on the other hand, relies on tools to examine the interfaces of a module or a component without using it, and can find defects such as uninitialized or unused variables, syntax violations, or inconsistencies between the source code and its documentation [54].

Dynamic testing is based on evaluating software behavior during execution. White-box testing, black-box testing, gray-box testing, and experience-based testing are all considered dynamic testing techniques. Even for these tests, the previously described activities of the ISTQB testing process, especially planning and observing the test object, are considerably important as well. To summarize, static testing provides general information about a program's logical aspect and coding style, while dynamic testing can validate system functionalities and also uncover defects.

Static analysis tools can provide functions similar to the optimization algorithms used by compilers. Additionally, such tools can be used to find as many defects as possible originating from the code's logic and structure. Running the source code together with some tools to check its quality is called dynamic analysis; this kind of testing can uncover types of defects such as memory leaks or issues with runtime dependencies [55]. Both types of tools support test automation efforts; however, while static tests add to the verification efforts, only dynamic tests can truly validate software quality.
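To make the static analysis idea concrete, the following minimal sketch (a hypothetical illustration, not a tool used in this thesis) flags variables that are assigned but never read, by inspecting the parsed source code without ever executing it — the defining property of static testing:

```python
# Illustrative static-analysis sketch (hypothetical): report names that are
# written (Store context) but never read (Load context), without running
# the code under analysis.
import ast

def unused_assignments(source: str) -> set:
    """Return variable names assigned in `source` but never read."""
    assigned, loaded = set(), set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                assigned.add(node.id)
            elif isinstance(node.ctx, ast.Load):
                loaded.add(node.id)
    return assigned - loaded

code = "x = 1\ny = 2\nprint(x)\n"
print(unused_assignments(code))  # {'y'}
```

Real static analyzers work on the same principle but track scopes, control flow, and many more defect classes.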
3.1.5 Testing Types

There are different testing types and subtypes, which can generally be divided into functional and non-functional tests.

Functional testing is a type of black-box testing. It starts by identifying the expected software features from the functional specifications. Then, a fitting type of input data is chosen based on the documented specifications, and the output following the execution of each feature is compared with the specified expected output. The emphasis of these tests is on verifying that the actual result of each designed test condition is true; in simpler terms, it is on checking that the functionalities do what they are expected to do.

Non-functional testing focuses on checking if the system implementation meets the non-functional requirements. These requirements are also referred to as quality attributes. On one hand, there are qualities observable at runtime, such as usability, security, scalability, or even privacy. On the other hand, there are evolution qualities that can only be observed in the inner code structure, such as extensibility or maintainability [56]. Generally speaking, non-functional testing aims to check how well a system does its functionalities. In practice, functional tests are generally prioritized over non-functional ones, due to time and budget constraints (or developer's inexperience). Moreover, testing non-functional requirements often needs the presence of highly specialized testers with a technical skill set, and is sometimes not even possible due to several difficulties: the specifications stated for these qualities during the planning phase are often not clearly set, which may lead to ambiguity and confusion when trying to test them [57].

Regression testing is a group of functional tests which refers to the process of re-testing software after committing some changes to it. Even small code modifications might potentially cause new defects, risking the system's behaviour to deviate from its intended purpose. The effects of the changed parts do not stop at one section of code: adding a modification or extension in one section might cause a different, distant section of the system to malfunction. It's therefore important to cover these effects in the regression tests. The depth and intensity of those tests should depend on the risk level of the new functionalities. When working on big projects, the regression tests might become unmanageably large if every possible test case is included. This, in turn, may lead to long test execution times and the exhaustion of computational resources. On the other hand, small regression tests can miss newly introduced defects, especially if the tests do not cover the system's implemented functionalities well enough. Therefore, it's important to choose efficiently the minimum set of test cases that covers the system well [58].

Continuous testing aims to deliver fast and continuous feedback with every software build. It is a form of functional testing associated with the build process. If testing is planned early and incorporated continuously, defects get discovered faster, and newly introduced defects are then prevented from migrating to the next build or release. This is achieved by executing the tests frequently. According to studies by David Saff and Michael Ernst [59], developers waste about 8% to 15% of their development time on regression-related rework, and continuous testing may significantly reduce this wasted time, by about 92% to 98%. Finally, this practice enables frequent measurement of the software: by examining and tracking the continuous testing information, it becomes possible to optimize the testing process as well as the tracked qualities of the system, and to continuously improve it.

Performance testing is a subgroup of non-functional testing that assesses software qualities such as speed, stability, response time, and scalability. It usually relies on observing and measuring the system behavior under an increased load. In other words, the software has to maintain its correct behavior when performing tasks within the planned specifications of acceptable response and execution times. For instance, these tests would check how fast the system finishes some specified tasks under fixed circumstances, or how long the internal processing time for performing a task is under some specified conditions. This type of testing can be very costly in both time and effort. However, tools for performance testing generally collect useful information and produce helpful reports, presented in text and graphical forms. The main subtypes of performance testing are the following [45]:

• Load testing: These tests check how well the system performs its functionalities under the average expected workload, as well as under the maximum anticipated load. The aim of these tests is to identify information such as the degree of possible quality reductions that occur when the functionalities are exercised under the anticipated maximum load.
• Stress testing: It tests the quality of the system by putting it under an extreme workload, for instance, high user traffic or increased data processing. The purpose of these tests is to observe how the system reacts, in order to determine the circumstances that would crash the system (or cause severe failures).
• Scalability testing: The objective of these tests is to assess the software's ability to grow in order to meet future requirements efficiently. In other words, it observes how well the system reacts to changes, so that decisions can be made pre-emptively, before acquiring losses in functionality.
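The load-testing idea described above can be sketched in a few lines. The service function and the one-second response-time budget below are purely hypothetical stand-ins for a real system and its planned specification:

```python
# Illustrative load-test sketch (hypothetical service): issue many concurrent
# calls and check that every observed response time stays within the planned
# acceptable limit.
import time
from concurrent.futures import ThreadPoolExecutor

def service_under_test(x: int) -> int:
    time.sleep(0.01)          # stands in for real processing work
    return x * 2

def measure_under_load(workers: int, requests: int) -> float:
    """Return the worst observed response time under concurrent load."""
    def timed_call(i: int) -> float:
        start = time.perf_counter()
        service_under_test(i)
        return time.perf_counter() - start
    with ThreadPoolExecutor(max_workers=workers) as pool:
        latencies = list(pool.map(timed_call, range(requests)))
    return max(latencies)

worst = measure_under_load(workers=8, requests=40)
print(worst < 1.0)  # True: every call finished within the 1-second budget
```

A stress test would follow the same structure but raise `workers` and `requests` until the service starts failing, in order to find the breaking point.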
Security testing controls if a software application protects its data from unauthorized parties, while maintaining the intended functionalities of the system being accessed or modified by all the legitimate users. Consequently, this type of testing, which is non-functional, holds the purpose of verifying that the main security principles listed below are sufficiently followed [57]:

• Confidentiality: The system should protect its data from being disclosed to third parties. This also includes protecting the privacy of the users (to an agreed upon extent).
• Integrity: The information stored on the system should only be modified or deleted by the authorized users. Moreover, no additional data should be created by an unauthorized party.
• Authentication: The system should, for example, be able to confirm the identity of a user, or the legitimacy of another program accessing one of its functionalities.
• Authorization: The software system should only allow functions to be accessed or performed by their intended users. This can be achieved by enforcing some form of access control.
• Availability: Authorized parties should always have access to their intended data. In other words, the system should ensure the availability of the needed data to all its legitimate users.
• Non-repudiation: The system should always guarantee that the sender and the receiver of a transferred message cannot deny their role in this data transfer process.

Other Testing Types

A large variety of testing types exist; some are similar to the previously presented ones, and some are only appropriate in a specific context. For instance, installation testing verifies that the installation of the system in its intended environment works correctly. Another example is conformance testing, which verifies how well the system conforms to a specific set of standards [48]. Then there's accessibility testing, which tests the system from the perspective of users with disabilities and ensures that it follows the accessibility standards [48]. Finally, this chapter focuses with more details on two other testing types, which are GUI testing (discussed in section 3.2) and usability evaluation (discussed in section 3.3).

3.1.6 Testing Levels

A test level is a group of testing activities that can be organized and managed collectively, where each level is associated with a distinct responsibility [46]. Generally, test levels lead to a categorization where the scope of the tests increases with the level, resulting in a classification consisting of the following: unit testing, component testing, integration testing, system testing, and finally acceptance testing.

Unit Testing

The purpose of unit testing is isolating the smallest testable part of the source code, and examining the behaviour of that part when working isolated from the rest of the software, comparing it with the expected result. A unit can be an object or a class (in object-oriented programming languages), an interface, a function, or a procedure (in other programming paradigms); sometimes it might be a set of multiple associated classes. Consequently, the expected result of a unit test can be the output of the examined unit, or its states. A good set of unit tests checks a single software unit's behaviour [60]. Such a set can have many advantages: Detecting a problem early on assists the refactoring efforts and prevents the propagation of defects, saving the time and effort that would have been wasted otherwise. It also helps with structuring the code well. It's important to note, however, that if these tests are weak in finding defects, they would most likely introduce more problems than improvements. Another advantage would be the guiding role unit tests can play in assisting a reader who wants to learn how to work with the code properly: a series of unit tests (with examples of valid and invalid input) is often more informative and up-to-date than an absent or outdated documentation. It's important to keep unit tests short and concise, so that running them can be done often and frequently. It's strongly advised that a unit test does not take long, communicate over a network, access the database, or edit system configuration files, and that it stays separate from the code that it tests.

Component Testing

This is also called module testing. Components are generally comprised of several smaller parts of directly developed code (or modules) linked to each other, or are a combination of units. Component testing is similar to unit testing, but occurs on a different level of the software's composition [40]. This method focuses on the testing of each component separately, rather than on the interactions between components, and its effectiveness lies in the identification of defects within the module itself. Generally, it's preferred to perform these tests before starting with integration testing, in order to uncover the defects that lurk in the modules before testing the interactions between them [40].

Integration Testing

The purpose of integration level testing is to test the interaction between the different individual modules or units, given that the individual units have already been tested separately. The modules can be considered a logical extension of code units. If an integration test fails while the separately tested modules pass, then it likely indicates an issue in the actual data transfer between the integrated combinations of modules [44]. If one of the two modules of an integration test has not been implemented yet, tests might simulate the behavior of the missing module, but verify the integration between it and an appropriate component; this technique is particularly useful for incremental approaches. Choosing the appropriate integration approach depends on the system architecture, and on the risks associated with the modules [44]. Many methods of integration testing exist; some of them are briefly described below [44]:

• Big bang testing: In the big bang approach, most of the modules are grouped together and tested all at once. This method considerably shortens the time spent on integration testing, but the test can only run once all modules have been developed, it may have a long execution time, and it becomes difficult for the team to trace the cause of a bug.
• Bottom-up integration testing: Since modules can instantiate other modules, their interactions produce a module hierarchy. Bottom-up testing is an incremental approach where the lowest level modules of the hierarchy tree structure are the first to be tested, then these integration tests are used to simplify the testing of the higher level modules. This process continues until reaching the module situated at the summit of the hierarchy.
• Top-down integration testing: This is also an incremental approach where the testing begins with the highest level module, then the related branches of modules are tested incrementally until the last of the lowest level modules is reached.
• Sandwich integration testing: This is a combination of the bottom-up and top-down approaches. Modules are simply tested as they become available.
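To make the unit and integration levels concrete, the following sketch uses hypothetical modules (all names are invented for illustration): a pure function is checked in isolation at the unit level, and a stub simulates a not-yet-implemented price provider for an integration-style test, as described above:

```python
# Illustrative sketch (hypothetical modules): a unit-level test of the
# smallest testable part, and an integration-style test where a missing
# collaborator is simulated by a stub.
import unittest

def net_price(price: float, vat_rate: float) -> float:
    """Unit under test: a pure function, no network or database access."""
    return round(price * (1 + vat_rate), 2)

class Checkout:
    """Module whose interaction with a price provider is integration-tested."""
    def __init__(self, price_provider):
        self.price_provider = price_provider
    def total(self, item: str) -> float:
        return net_price(self.price_provider.price_of(item), vat_rate=0.20)

class StubPriceProvider:
    """Simulates the behavior of a module that does not exist yet."""
    def price_of(self, item: str) -> float:
        return 10.0

class UnitAndIntegrationTests(unittest.TestCase):
    def test_net_price_unit(self):          # unit level: isolated function
        self.assertEqual(net_price(10.0, 0.20), 12.0)
    def test_checkout_with_stub(self):      # integration level with a stub
        self.assertEqual(Checkout(StubPriceProvider()).total("book"), 12.0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(UnitAndIntegrationTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Note that both tests are fast and self-contained, in line with the advice above that unit tests should not touch networks, databases, or configuration files.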
System Testing

Within this testing level, the purpose is to check both functional and non-functional requirements, and to investigate how well the components communicate with each other and with the system in general. System testing is a black-box technique, and is usually applied on a complete system, checking how well the product meets the requirements. In addition to functional tests, many supplementary types of software testing can also be performed; examples would be regression testing, performance testing, and security testing (all types of testing described in section 3.1.5). Choosing which types of system testing to adopt depends on the available time, the budget, the risk factor, and the skill set of the testers [44].

Acceptance Testing

The acceptance testing level formally evaluates the software system based on the user requirements, from a customer's perspective. Afterwards, a decision is made regarding whether the software satisfies the acceptance criteria [46]. It is performed after system testing and before delivery. Unlike other types of testing, the aim of acceptance testing is not to identify defects, but rather to demonstrate that the software solution is working correctly. The customer or end-user themselves evaluate the fulfillment of the specified requirements (sometimes the presence of a witness, or rather of someone else doing the test, is required). The test object usually constitutes the whole system. Also, experience-based testing techniques can be relied upon [45].

3.1.7 Test Automation

The goal of test automation is not to replace manual testing entirely, but to decrease the number of test cases performed manually. As the name implies, in manual testing the test steps of detailed test cases are performed by hand: a human tester follows the defined test cases and performs the end-user actions [58]. Automated testing, on the other hand, relies on a software tool to plan, generate, or execute the test cases and test suites; the automation tool simulates the end-user actions, then compares the actual outcome with the expected result. The automation tool is a separate software from the object under test, but the properties of the developed software can greatly influence the selection of the tool in which the tests are executed [45].

Automating tests brings numerous advantages, such as the fast speed at which automated tests can be executed, which far exceeds the manual effort. Furthermore, the tests are repeatable, and test suites can be reused on different versions of the software [58]. Moreover, automated tests that are designed in a comprehensive manner are easy to understand and create, and testers can easily run them, which improves productivity and helps uncover unwanted system behaviour in the long run [61]. Thus, automation indirectly makes it easier for newer team members to understand the tested system; this is particularly obvious when the documentation is absent or not up-to-date. Section 3.2.3 discusses these advantages in more detail, with a focus on automated GUI testing, which is, as the name implies, a particular subset of automated testing. Additional benefits are also presented in section 3.3.4 from a usability evaluator perspective.

On the other hand, test automation also has its own disadvantages. These include, for instance, the high initial development cost per automated feature. In other words, a development team needs more time for creating the initial set of tests, as well as a larger skill set, as more tooling is needed (such as testing frameworks and test runners). Additionally, when an automated test fails, it might prove difficult to comprehend whether the test failed because of a bug in the source code, a bug in the test automation code, or due to a change in the requirements. The limitations of test automation are expanded upon in section 3.2.5, with a focus on the GUI testing subgroup [45].
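The plan–execute–compare cycle described above can be reduced to a minimal sketch. The object under test and the test cases here are hypothetical, and this is of course not a real automation tool:

```python
# Minimal automation-harness sketch (hypothetical): test cases pair an input
# with an expected outcome; a driver executes the object under test, compares
# actual and expected results, and produces a small report.

def object_under_test(x: int) -> int:
    return x * x  # hypothetical system behaviour being verified

test_cases = [
    {"input": 2, "expected": 4},
    {"input": 3, "expected": 9},
    {"input": -1, "expected": 1},
]

def run_suite(func, cases) -> dict:
    """Execute every case and report passed/failed counts, like a test runner."""
    report = {"passed": 0, "failed": 0, "failures": []}
    for case in cases:
        actual = func(case["input"])
        if actual == case["expected"]:
            report["passed"] += 1
        else:
            report["failed"] += 1
            report["failures"].append((case, actual))
    return report

print(run_suite(object_under_test, test_cases))
# {'passed': 3, 'failed': 0, 'failures': []}
```

Real automation tools add the pieces this sketch omits: simulating end-user actions against a separate application, scheduling, and detailed reporting.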
Test Automation Key Factors

The extent of the success of test automation depends on some key factors that need to be considered, among which are the following [58]:

• The number of regression cycles: If the number of times that regression tests need to be executed on product builds or releases is high, then relying on test automation helps in effectively saving testing time and effort. Additionally, this is also necessary when adopting continuous testing, as the cost of executing the tests manually then becomes too high.
• The regression test suite's size: The smaller the test suite's size, the easier it becomes for the testers to manage the test suites. Furthermore, getting closer to the maximum coverage requires an exhaustive testing effort, which might end up being counter-productive in some cases. Besides, bigger-sized test suites do not necessarily reflect a better test coverage; the better approach is to aim for an optimal coverage using minimal-sized test suites. Automated tests can run with minimal human intervention: if the regression test suite is large, it can be executed overnight. This frees up the computational resources during the day and significantly saves in development time.
• Tool capabilities: The chosen tool has to be compatible with the software being tested, as well as appropriate for the tasks to be performed. Some tools are more appropriate for functional tests, while others are better for regression tests. The selection of the tool should not be influenced by its popularity, but rather should be based on the requirements.
• Automation cost: Automating the tests reduces the testing time, but it needs a specialised skill set. The initial investment cost is higher than for manual testing: for instance, a considerable sum is spent on purchasing an automation tool licence and training the employees to use it. Besides, even after a test has been fully automated, maintaining it can still cost a company additional effort. In the long run, however, the automation improves the produced software quality.

Test Automation Steps

All stages in software development can benefit from testing, and by extension from test automation. Tools can assist in the early identification of conflicting requirements during the requirement analysis phase, help in validating the system's architecture by performing a static or dynamic analysis during the design phase, and perform performance tests during the implementation phase. However, no matter the development stage, functional tests should also pass. The automation is introduced through the following steps [45]:

• Test tool selection: The selection starts by analysing the specified requirements and identifying the testing aspects where tools can improve the testing process. Afterwards, clear selection criteria are defined; then tools fitting these criteria are searched for, investigated, and evaluated. Choosing a tool not only depends on its supported technologies and environments, but also on the supported types of testing, the tool's scripting language, its training cost, as well as the organisation's readiness for change. It's also possible to conduct a proof of concept to prove that the tool selection meets the specified requirements.
• Defining the scope of automation: The scope of the automation refers to the functionalities of the system that will be automated. Determining the scope is most of the time based on the priority, with the highest priority given to the scenarios that consume the most testing time and effort, the repeatability of the task, and the technical feasibility.
• Test planning and design: In this step, the test types to be performed, the test level on which automation applies, and the tests containing the appropriate tool are defined, as specified by the test strategy [62].
• Test execution: During this step, the automation test scripts are executed. The scripts usually require some data as input, and produce detailed test reports as output.
• Test maintenance: Since the software under test would continuously acquire new functionalities until its final release, new automation scripts are frequently added, and existing ones are maintained to improve the quality of testing, and especially the effectiveness of regression tests.

Test Measurement and Evaluation

Most software testing metrics concentrate on evaluating either the progress, the coverage, or the quality. First of all, the automated test coverage is a function of all the code that gets executed by running the automated test suites. Many types of coverage exist, such as the statement coverage (see section 3.1.3). The automated test coverage is a percentage referring to the completeness of the test coverage, and can measure the extent of automation relative to the total coverage (including the manual one). The following equation represents this metric [63]:

Automated Test Coverage: ATC = AC / TC × 100

where:
• AC = Automation coverage
• TC = Total test coverage (comprising the sum of manual and automated coverages)

An additional metric is the automation index, which indicates the percentage of test cases that are suitable for automation among all test cases. The decision of whether to automate a test or not is based on how much time and effort the automation would save compared to the alternative. The automation index is measured using the following equation [63]:

Automation Index: AI = AT / TT × 100

where:
• AT = Number of automatable test cases
• TT = Total number of test cases
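The two metrics above can be computed directly from project numbers. The figures in the example are hypothetical:

```python
# Computing the two automation metrics defined above from hypothetical
# project numbers (all values are illustrative).

def automated_test_coverage(automation_coverage: float, total_coverage: float) -> float:
    """ATC = AC / TC * 100, with coverages given in percent of code covered."""
    return automation_coverage / total_coverage * 100

def automation_index(automatable_cases: int, total_cases: int) -> float:
    """AI = AT / TT * 100."""
    return automatable_cases / total_cases * 100

# Example: automated tests cover 60% of the code out of 80% total coverage,
# and 150 of 200 test cases are worth automating.
print(automated_test_coverage(60, 80))   # 75.0
print(automation_index(150, 200))        # 75.0
```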
Subsequently, an additional metric is the automation progress, which is the number of tests that have been automated compared to the overall number of automatable test cases. It is a percentage referring to which of the automatable tests have been automated. The following equation represents this relation [63]:

Automated Progress: AP = AA / AT × 100

where:
• AA = Current number of automated test cases
• AT = Number of automatable test cases

Furthermore, the quality of the tests is an important aspect, and in particular their ability to detect defects, regardless of their mode of execution. Defect removal efficiency is a metric used to evaluate the effect of automation efforts on the variations in defect removal under various tests. It gives a company a good perspective on the quality of its automated tests and their defect detection. The defect removal efficiency follows the equation stated below [63]:

Defect Removal Efficiency: DRE = DT / (DT + DD) × 100

where:
• DT = Number of defects identified during testing
• DD = Number of defects found after delivery (which is different from the total number of existing defects, and rather refers only to the quantity of defects found)

All these presented metrics help in tracking and measuring the added value of automated tests compared to the manual ones. However, evaluating the return on investment (which is the ratio of profit to expenses) for a project does not correspond to similar percentages, but is rather based on the savings in time and resources. The automation efforts are first mapped to a cost, then the savings are divided by the expenses. In order to determine the extent of the savings of test automation, the cost of executing each of these tests needs to be compared to the cost of manually executing the same test case; the frequency of each test case is considered, and the measurement for this is captured in man-hours (or their potential loss). To summarize, in order for test automation to be successful, the automation goals need to be clearly defined at the start and tracked across all phases of the software life cycle. Metrics such as the automation index, the automation's progress, the return on investment, and the automated coverage can help immensely in assessing the current status of testing, and in adjusting the testing accordingly.

3.2 Graphical User Interface Testing

As discussed in section 2.1, a GUI encompasses the interaction between the system and the users. Therefore, even if the non-graphical part of the software is developed in a way that supports all its functional requirements, it's still possible for a mistake in the GUI to obstruct or partially hinder the user from performing their tasks. Consequently, GUI testing is needed to confirm that the software truly meets the specified requirements [40]. Therefore, it's needed to comprehend GUI testing, its benefits, its limitations, and its methods.
3.2.1 Definition of GUI Testing

GUI testing refers to the practice of testing a software's GUI to ensure that it meets the specified requirements. Usually, this implies performing a set of test cases that entirely covers the functionalities offered by the system and fully exercises the GUI by interacting with the major GUI elements [27]. GUI testing can be performed manually or be automated with the help of a tool. Automation can simulate GUI events, such as a click on a GUI element or pressing a button, give a predefined input to a GUI object, and compare the properties of each GUI element with the expected outcome. An automated GUI testing tool can also validate the specified properties of the GUI during execution. The set of values of the properties of the GUI elements constitutes the GUI state.

3.2.2 Aim of GUI Testing

One of the goals of GUI testing is to uncover defects by triggering certain GUI events in a specified order. A defect sometimes manifests itself only in the GUI itself. Such defects are generally difficult to locate, due to the exponentially high number of possible event sequences and the high number of combinations of GUI elements and events [64]. In addition to finding defects, GUI testing focuses on performing functional test cases from the user's perspective. Every requirement would have test cases checking if that requirement is sufficiently met. Therefore, this process is critically important for functional tests at the system level, which are exclusively in charge of checking the user's perspective [65]. Furthermore, GUI tests might check for the following (but not for higher level quality requirements) [65]:

• The overall design of the GUI and its colors should be aesthetically pleasant. The font used should be readable, and the text should be correct and concise.
• GUI elements should be responsive to different screen sizes (and screen resolutions).
• The position and size of GUI elements should be properly aligned. Moreover, the distance between GUI elements should be consistent, and the images used should have good quality and also be properly aligned vertically and horizontally.
• Error messages should have relevant text and should be displayed at the moment the error occurred. Furthermore, dates should be presented in the correct date format.
• Input fields should have some form of validation. For instance, if only numbers are accepted in a field, then the system should prevent the user from submitting text data.
• The length and width of text input fields should be acceptable. This is especially important for text fields.
• After the GUI window is resized, GUI elements should not overlap or become unrecognisable. Besides, their new positions should be acceptable.
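Several of the checks above (for instance, that elements do not overlap after resizing) can be expressed as automated assertions over GUI element properties, in the spirit of the GUI state described in section 3.2.1. The sketch below models the state as a plain dictionary of geometry values; it is a simplified illustration and is not tied to any particular GUI toolkit:

```python
def overlaps(a: dict, b: dict) -> bool:
    """True if the bounding boxes of two GUI elements overlap."""
    return (a["x"] < b["x"] + b["w"] and b["x"] < a["x"] + a["w"]
            and a["y"] < b["y"] + b["h"] and b["y"] < a["y"] + a["h"])


def check_no_overlap(elements: dict) -> list:
    """Return all pairs of element names whose bounding boxes overlap."""
    names = list(elements)
    bad = []
    for i, n1 in enumerate(names):
        for n2 in names[i + 1:]:
            if overlaps(elements[n1], elements[n2]):
                bad.append((n1, n2))
    return bad


# Hypothetical GUI state after a simulated window resize.
state = {
    "ok_button":     {"x": 10,  "y": 10, "w": 80, "h": 30},
    "cancel_button": {"x": 100, "y": 10, "w": 80, "h": 30},
}
assert check_no_overlap(state) == []  # no elements overlap in this layout
```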
test coding significantly independence improving up and some factors saves the respecting which of reusable Some and highly cost. practices suite and test best a testing make some can principles Following same the reusability: repeating case comparison, Test exhausting. In more the much suite. in is environment changes test multiple various in making in same scenario test require the manual same only executing the usually then testing would file, of tests need configuration tests GUI in these when automated more the true as increases especially environments, tests is for manual This (except of cost effort been repeated. the and have hand, are time cases other in the test cost On these additional regression maintenance). Once in without test present continuously release. be run the each should can into after cases they quality lurk test automated, software might GUI high defect repeated. ensure be some to to code, need testing, source tests old the faster. the process in and the change system make every to After computers multiple repeatable: on Being power run be computational also the human can when tests require overnight These not tests needed. does the not run tests is can automated tool of automation execution an the intervention, Since unattended: tests Running performance GUI complex performing automa- repeatedly an of by hassle manually. the correctly tests up performed saves be This can localised [66]. activities, tool other timer-driven tion the on interac- complex check rely the requiring easily that applications then sessions can of elements, tive testing GUI, graphical performance one their However, GUI checked in even tester. Besides, that not versions. a and case used for test language difficult GUI the extremely automated in be same differ can test- only manually GUI GUIs as a these such if of tasks versions Some localised manually: different done if ing challenging are that tasks without Simplifying details. 
time in every results correctly repeated the is steps be report same actor to and has the human log and perform to complicated the can forgetting is tests manually, task GUI the test Automated when a times. true many particularly performing is When This coverage. errors. test errors: to in prone of increase an data, number to test the leads of this Decreasing variety All a data. generating combining such as of to well effect responses as the system events, analysing verifying GUI values and in of expected assist sequences can the the different automation holding on running Besides, always task and examine executing. are performed also is states the test or object of the effect internal product, as the some software as if a such or of behaviour, memory, system’s functionalities and structure the of automated program checking scope An inner in manually. and the challenging depth help considered the can be would increase suite that can test way tool a test in tests automation performed An the coverage: to test cases, the test Increasing the design wisely and might cost. tasks, im- tests maintenance stable therefore high these it’s sufficiently avoid counter-productive, maintaining automating being Also, choose of extent to the counterparts. portant automated to manual execution, consuming for their time ready scripts highly than are test become and faster the written run preparing been that have tests significantly they noted GUI once in be but cases should time, It test extra execute up them. takes can performing tools manually automation than Test time less time: execution on up Saving Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. datg.Ayrarneeti h oiin fteGIeeet a ra h et,adwould and dis- tests, the major anew. 
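The point about re-running the same scenario in multiple environments by only changing a configuration file can be illustrated with a minimal sketch. The environment names, URLs, and keys below are invented for the example; in practice the configuration would live in an external file rather than in the test code:

```python
# Hypothetical per-environment configuration (would normally be an external file).
ENVIRONMENTS = {
    "staging":    {"base_url": "https://staging.example.test", "timeout_s": 10},
    "production": {"base_url": "https://www.example.test",     "timeout_s": 5},
}


def run_login_scenario(env_name: str) -> str:
    """Run the same scripted scenario against the configured environment."""
    cfg = ENVIRONMENTS[env_name]
    # The steps stay identical in every environment; only configuration values differ.
    steps = [
        f"open {cfg['base_url']}/login",
        "type username",
        "type password",
        "click submit",
        f"wait up to {cfg['timeout_s']}s for dashboard",
    ]
    return "; ".join(steps)
```

Switching environments is then a one-argument change, e.g. `run_login_scenario("staging")` versus `run_login_scenario("production")`, with no edits to the scenario itself.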
break a can cases holds elements test still GUI the the it recording of programming, require positions like perform the phase to skills in second easy rearrangement specialised is Any the some method advantage. Afterwards, having this the demand though case. Even is not test cost. does phase additional the and first any performing without the endlessly manually of repeated requires execution be it can The as testing. part, regression costly GUI only have timing. automated that same events supports the the method with all This and replays manner and same source, event exact during an the Afterwards, as in events. itself recorded occurring registers been the tool of phase, the details recording the phase, the all playback during method the listener notification event event an its as to itself adds registers and tool and automation the movements the this, process, mouse achieve including To replay back the played human during and direct Next, recorded any [65]. be without presses tool. can test key actions the under user application of All the help manually on intervention. case with steps test stages. steps pre-recorded GUI two the the his of executes executes all has tester computer the it recording phase, and while this tool In the automation process. all test capture a the with called is performed first is The method replay and capture The Method Replay and Capture the method, replay and capture the as method. such script-based cases, the test and GUI method, automating model-based of ways various are There GUIs Testing Automating of Methods 3.2.4 with associated is being or activity critical the business if considered is preferred and case also changes, test is risks. frequent the automation high to more running test subject being if GUI tester not as manual Moreover, is the well case tested in test mistakes). as results GUI repeatedly, make (which a difficult it to more words, execute likely or other automation to consuming In for time need test. 
more suitable a is GUI more there’s manually a is if not task or automated, testing automate better to a is whether if deciding determine when that apply rules also general the summarise, To uoaino ersi-ae sblt npcin3 128 / 34 Inspection Usability Heuristic-based of Automation Testing GUI 3.2. Testing Software of Fundamentals 3. Chapter • • ciiis tsbte ofe h ua eore n iette oadsligmr chal- more solving toward set. monotonous them skill same direct their improving and the or resources repeating problems human of de- lenging the instead and free tasks testers to other automated, better on are It’s focus tasks activities. to more time As more tasks: have other velopers for products developers delivered and on the tester wasted in Freeing reputation). been trust company’s have a decreased would ruin a might that by failures cost software are caused extra some defects losses (since the more potential coverage, of by even test conservation Additionally, or the the debugging, effort. increasing to human by leads and in which decreases, cost found, also the ad- cost on an the up without time, saves decrease. repeatedly saving This run not can does effort. tests testing programming GUI GUI ditional automated term. manual implemented, short for the once in cost comparison, cheaper In the testing test GUI progresses, scripting manual development and makes testers, as that training However, sum tools, considerable new a cost Acquiring can cost: cases testing in decrease independence term its difficult. then Long is finishes, case case test test the another reusing after and run poor, be is only aspect can case test a if example, Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin3 128 / 35 the If Inspection is appearance. 
Usability to Heuristic-based method external easier of its this becomes Automation approach than behind this rather idea then GUI concern, main of the The separation of supports aspects used capture. architecture underlying software’s event the called on concentrate also to is approach called is script-based technique This The GUI scenario. of usage sequence certain ordered a [64]. mimicking an sequencing at Triggering event aims GUI generally the component. case in GUI test changes a a by affected of in events get properties key usually trivial of not other properties do they main or thus, the position result, tracking the events, and observing GUI and collaborate of components, developer sequence GUI change. a the of running and accepting involve more usually tester and tests the robust These more both be and would This tests level Additionally, written [64]. abstraction the tests. manner elements then, high regression properly, similar GUI as communicate a of efficiently a have and properties in tests repeatedly the run these run to would can if belong that tests checked cases scripted pro- values test The yields the writing approach involves that tool. except generally automation tests, it the unit approach, by to automation executed test be script-based to the grams is method last The step Method test Script-based single one is recapturing effect to This leads anew. case element test GUI [69]. complete single efforts the when one recording maintenance and of changing test possible instead that is reducing fact approach the significantly based to in model linked result the can with correctly method replay done and capture the Combining sig- [68]: a points methods. alternative following require over the them advantage include of any gaining benefits all however, before such model, test, Generally, the GUI building based in model effort a initial nificant performing of ways the various building are inconsistency. 
while There an occurred have requirements error gathered the an behavior the by that the that caused possible then or generally also fails, model, is is which test it expectations, the However, model’s If the the implementation. match running outcome. erroneous not after expected does and the application model, tested with this Testing Follow- the compared for GUI of model. is determined the result are 3.2. from actual outputs generated expected the is and tests, suite is inputs test on the model a (depending step, specifications, the visually the this represented testing, be on ing model-based speci- or based program for the Then, textual support on used). a as with based tool written the tool cases be a test can of model using efficient This representation step, of [67]. graphical first generation built a a pre- the provide In in and can assist examining requirements. model also fied in a can helps it Since This behavior, system’s application. a the testing. of GUI behavior model-based the is dicting method automation second The Method Model-based Testing Software of Fundamentals 3. Chapter • • • • etr en diganwato otemdl n yrnigti cin twudbe would it action, test this generating automatically. by running covered gets new Therefore, by feature a and actions. new adding the running model, cases, existing since the other to design, with action fluid combined new from automatically a benefit adding would means application feature the for identification. defect tests in Generating efforts continuous and to generate due achieved to is added. easier coverage feature test are new Higher every cases for test tests new the write because to needing decreases without maintenance re-generate test for once. cost only The done is requirements the from derived rules the Specifying Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. 
3.2.5 Limitations of GUI Test Automation

As previously stated, GUI test automation should not aim to replace manual testing entirely. This is even more true if the functional requirements of the application change frequently. A modification in some previously written GUI test cases is needed if a requirement change occurs. What's more, even a small rearrangement in the GUI might cause some of the previously written test cases to cease to function partially or entirely (depending on the method of GUI automation) [70]. In other words, frequent changes in the requirements and frequent updates of GUIs might cause changes in the properties of the test objects, which might result in redoing the scripted test suite, or result in false positives or false negatives due to the nature of their source code [70]. The automated test suite must be reliable in producing accurate test results. In some cases, such as when a failing GUI object has been replaced, adjustments in the test source code are needed. Instead of leaving the test case non-executable, it is better to continuously adapt the test cases. Neglecting these needed adjustments would endanger the correctness and reliability of the test suite. Consequently, the more frequently the changes to the graphical user interface occur, the more the possible vulnerabilities in the scripted GUI test case would increase, or cause it to become unusable. In this particular case, the capture and replay method, with its maintenance cost, would be more beneficial than the script-based automation (or testing manually). Even after the release of a certain software version, it is preferable to only consider GUI test automation when the GUI reaches an acceptable level of stability [45]. When performing regression tests on a next version, it can be very challenging to identify whether an encountered issue means that the GUI test case fails to execute, or whether the problem manifests itself in the tested application. It is sometimes a difficult matter to identify whether a failure is caused by a defect or by a new improvement of the application. The tester then either finds himself compelled to adjust the automated GUI tests to the changes, or he reports a defect [70]. Moreover, this is further complicated if the GUI or the tests are not documented.

Even if a GUI test case is completely suitable for automation, it is a good practice to first execute the test case manually at least once before automating it, in order to comprehend the situation and to be sure that it is possible to perform the written test successfully. Otherwise, if the test case would never come to pass, it would at least be possible to know that a defect exists, even if it won't be identified [58].

Some GUI problems only occur under special circumstances. Experienced testers who are proficient in error guessing have quick ways to check the robustness of a GUI. Depending on the application's nature or context, the most suited testing methods vary. Trying to automate all the different GUI test cases based on error guessing is time consuming and unnecessary, since it is sufficient to identify such defects manually once by a qualified tester, and repeating the execution of these tests on a regular basis is unlikely to identify more defects [70]. Furthermore, if the regular repeated execution of GUI test cases is not required due to constraints, then manual GUI testing becomes more cost effective. As previously stated, test automation requires a high initial investment even before writing the first test script, due to the need to acquire tools and prepare the automation infrastructure. This goes without mentioning that producing a GUI test case for automation requires more time than manually performing it. To summarize, if the repeated execution of these tests is not needed, then test automation for these tasks is not profitable [58].
relying, focus the section The (by through in quality discussed usability GUI, metrics the system’s the of a on measurement instance, the of for through assessment and usability issues usability of of purpose identification the serves evaluation Usability Evaluation Usability Defining 3.3.1 process tests. the functional makes from which different knowledge, very and evaluation is skills application special usability requires an of and of manner usability special the a evaluating section in requirements, in performed non-functional explained most previously like (as just requirement and software 2.2.1), non-functional important an is Usability Evaluation Usability 3.3 and interfaces, cases. and party test third automated specialised various reusable more less on the in gets counterparts results that it it their means dependent consequence, more than This by the tests becomes, [70]. automated software software a their customized customised might reusing producing application on in tested test focuses the chances Evaluation a independent of that Usability better and reusable properties generic how have develop the more 3.3. that However, to solutions the Organizations directly software gets. automation. words, relates test testing other of of suite applicability In cost the test influence the written become. less a the can of is, automation suite reusability in the investment that the not note profitable also to is important tasks these it’s for Finally, automation test then needed, not is [58]. tests needed these of execution repeated the Testing Software of Fundamentals 3. Chapter • a eso.Eauto ehiusta xld h seto okn ihrepresentative with working of aspect the fi- exclude the gather that producing to techniques until order iteratively Evaluation in prototypes GUI, version. 
future a of nal through design the tasks improving specific for performs with information with that working cooperating sample involve and when techniques user development, testing useful representative formative of significantly a that end proves note the to which the important until manner, It’s evaluates stage iterative prototypes. process design an This the in GUI. from occurs a starting often also of software design is the the (which of forming evaluation usability in formative helps implies, testing) name formative the called As evaluation: usability Formative Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. prah h rcs rfrbyicuesm omnsesta r ttdi h olwn or- following the in stated are that steps selected common the [74]: some on der include depending preferably differ process might the process evaluation approach, usability steps [71]. the consuming first though time the even very however, Furthermore, are counterpart, validation its their than of and expensive number models any less accurate of process creating usability the in the makes testing repeatedly This have for models designs. used the be GUI Once can they behavior. validated, user and predict prepared to been models on relies evaluation usability Model-based Evaluation Model-Based encounter. This might end-users experts. real usability that issues of evaluation usability these skills However, the perform. and all to uncover knowledge easier cannot and the techniques consuming, can on time These less rely expensive, or less evaluation. them standards, expert-based makes and for guidelines techniques on also rely are there evaluation, user-based Besides Evaluation Expert-Based no and users around alternative. the centered to are superior methods declared validation testing be and Both can verification approach 3.1). 
(with specific section evalu- design summative in final the described the for while concepts validation verification, being usability usability a of delivers role approach the ation plays evaluation formative short, In uoaino ersi-ae sblt npcin3 128 / 38 Inspection Usability Heuristic-based of Automation Evaluation Usability 3.3. Testing Software of Fundamentals 3. Chapter .Dfietetre srgroup. user target the Define evaluate. 3. to GUI the of aspects which Determine 2. evaluation. usability the of goals the Specify 1. • eepoe omaueuaiiyi aoaoyevrnetta re odpiaerealistic duplicate distractions. to the eliminate tries and if that environment use, out laboratory of find a conditions might in to techniques usability is evaluation measure summative to purpose Therefore, employed Their be goals. tasks specified functionalities. The some system [71]. meets main participants design groups. 7 GUI the these to depict 5 of performed about one be by each represented to than for is user representatives end-users manner (or of include formal groups class to user more each Typically, has different a tests at in targeted summative performed is then developed is classes), software and the development, This If of design. evaluation. GUI end formative a the of degree towards satisfaction done assessing user docu- is the and on and evaluate efficiency, focuses techniques the effectiveness, testing It the Summative ment designs. testing): GUI summative near-complete or called complete (also more evaluation in results usability typically Summative also This and GUI. small, the the of be [71]. for part to mechanisms Consequently, specific reporting needs a test sample at pace. informal directed user fast be the a to frequently, usability. 
have in more tasks measuring conducted the and about be quickly not to happen eleminat- it’s tests to for and these evaluation data for iterations, collect usability better design to formative it’s future of is Therefore, in part evaluation not problems formative also usability of are other goal ing surveys of for the as design applies the Furthermore, such engage techniques same directly evaluation. that or The means tasks perform This [73]. not GUI. do testing a users formative the which of in part methods evaluation considered not are end-users, Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin3 128 / 39 Inspection Usability Heuristic-based of Automation Evaluation Usability 3.3. Testing Software of Fundamentals 3. Chapter eet htmk sblt et ot h fot oeo hs eisaetefloig[][34]: [8] following common the are some merits hold various these all of with Some still evaluations effort. they the other, usability worth each tests of usability with methods make compared that and benefits when types weaknesses and different points many strong are there though Even Tests Usability of Benefits 3.3.2 are instances classification. evaluation method usability depth in concrete the more More at and a made constraints, alongside performed. be time 3.3.3, be the to section budget, to decision in the tasks described a as the is such factors project with various a associated on of risk depending quality project a usability of the start evaluate the to when and how Choosing 2 rsn h vlainrslsi tagtowr n nes oudrtn format. understand to easy an and straightforward a in results evaluation the Present 12. exper- the repeating instance, for by, process the iterate completely or partially needed, If 11. improvements. or solutions usability Suggest 10. .Eaieaditrrttegtee data. gathered the interpret and Examine 9. data. 
usability-relevant record and Collect 8. experiments. the Design 7. performed. be to tasks the Choose methods. of 6. combination the or method evaluation the Choose 5. metrics usability the Choose 4. • • • mnst e ftesgetdsltoswr,o ygigfrhrbc ntepoesto process the in back further going [75]. by evaluate or to GUI work, the solutions of suggested aspects new the determine if see to iments sr ol eal opromterdiytssfse.I h otaebiti agtdat the targeted use is to built how learn software quickly the would If employees new faster. that tasks important daily end- it’s its their then software, perform users, a to business of able value be usability would the users increasing By productivity: issues employee these Increasing solving immensely. and costs detecting post-release contrast, on translate By up would save cost. issues would preemptively usability support that customer means support higher This more live into the support. some software, directly their technical a or contact of using to website, when use are a facing they correct are e-mail, likely clients and through difficulties delivered effective more the be cost Generally, can making software. It compa- in which services. customers by and its services products to to assistance refers support provide Customer nies costs: support customer Decreasing user improves greatly Usability which liability. issues, a design become solving satisfaction. would and feature identifying particular figure in that cannot help efforts feature, users evaluation of a use the percentage use can’t big properly using they a to that if if how point words, out satisfaction, the other to In customer effectively. issues features low usability features software some have excellent some to offers still due service end-users can the or it frustrates product software functionalities, software needed and a intentions if much purchase higher Even and customer a higher rates. 
to to turnover translate extension, would client company by that lower a and because of clients, satisfaction, services their customer or of of requirements rate products the the exceed for or important meet very to It’s satisfaction: user Increasing Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin4 128 / 40 Inspection Usability Heuristic-based of Automation Evaluation Usability 3.3. Testing Software of Fundamentals 3. Chapter • • • • • • o vr n fispout ob fhg ult nldn srfinlns.Usability user-friendliness. including quality high of easy be sufficiently indispensable to it’s be products reputation, to company’s its a GUI of improve the or one for protect every is to for order expectations in cause these development Therefore, since of might losses its use. one expectations financial to by Generally, huge customer to received costly. leading meet company, well very the to were is in failing trust products their industry, previous lose software to its product clients the all failed In if one even just customers. reputation Sometimes, company’s not reputation: a company’s are ruins the who improving those [78]. or literacy, equipment Maintaining low legacy on with working those those would or as accessibility language, and such the usability in disabilities the fluent the without improving are users words, that other benefit situations In also and [77]. design. users software customers inclusive to offering disabled of apply their Companies focus also of loyalty requirements disabilities. the accessibility retain with several usable to However, users tend how for accessibility addresses good is it with service and solutions or usability of product subset software a a the is guiding Accessibility in accessibility: assist Improving they result, features. 
a these implementing as towards and efforts users, design functionalities aid and losses determine development significantly of also that risks can approaches reduce evaluations also design these they or Furthermore, thus resources. and of competition, help misuse the would by and learned made lessons mistakes The are, the companies features. avoiding offered other their in of accentuate weaknesses designs and their competitors. strengths well its the how of what designs and of the understanding to compared an company for a allows is by evaluations It used expert- usability design competitive but GUI of certain test, purpose a usability assess The any to appropriate. to most applied the for be are However, us- can methods competitive evaluations based perform competition. of to type the advised This it’s with value, evaluations. advantage it ability highest compare the clients constitutes yield to This when efforts company usability ser- described). previously or a (as product for improves software advantage also a an satisfaction of user usability overall the the increasing vice, By advantage: competitive of a methods Gaining (such user-friendly designs, more 2.2.3). different stop is section assess in one to to discussed which way upon are numbers relied measurement great with be prove a can and metrics are them, because tests compare team Usability a in design: arguments design regarding disagreements his team in Resolving him [76]. guides trust service design user or the and product offered sales effectively client the increases how the a in usability and in easily improving Consequently, for, is how process. product searching check decision he’s the would purchase evaluations product if usability the accentuated especially find instance, is For can This e-commerce. customers needs. of by their field accepted with better well are resonate products and user-friendly Naturally, wasting revenue: avoiding [76]. 
the for need Increasing useful not do also users is that and features products, on be similar effort can documen- and other evaluations detailed time of usability for designs through need future gained the documentation in knowledge quality, for applied the usability goes Moreover, high same minimized. a The is has tation software redesign. the and ensuring development by of prob- cost, cost usability of the detection reduce early The can costs: lems and productivity. efforts their redesign increasing and efforts development thus, usability Reducing time, Consequently, less in recovery. make work do the more users doing in if in them and employees assist important, help should is prevention software error the Also, mistakes, tasks. their perform and software Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. o culuesitrc iha nefc nodrt eemn sblt rbes hr are [79]: There following the problems. are them usability of determine some to class, this order to in belong interface that observe methods an evaluators concrete more with many or interact one having users by service actual or how product a evaluating to refers testing Usability Method Testing The which [79]. categories simulation different and five modeling, into analytical methods inspection, inquiry, engineering univer- testing, the usability of are: all Marti classified Hearst and California Melody of Ivory de- sity Researchers more exist. another categorisation However, in-depth evaluation approaches. and these model-based tailed classify measur- and to and possible expert-based, issues, it’s user-based, 3.3.1, design into section solving methods in needs, described their As examining improvements. 
tasks, usability collecting their ing for and methods users evaluation about various data research useful of decades over developed engineers Usability Evaluation Usability of Methods 3.3.3 their as well as techniques. im- evaluation applied, it’s usability they’re other when offers, to and method compared how differences assessment and methods, every similarities evaluation concrete what examine and to evaluation portant usability comprehend further To uoaino ersi-ae sblt npcin4 128 / 41 Inspection Usability Heuristic-based of Automation Evaluation Usability 3.3. Testing Software of Fundamentals 3. Chapter • • • • eoe aiirwt t fewrs enest eciehwt efr h ak to tasks the he perform until to system how the describe with to interact needs to he has Afterwards, participant it. the with step, familiar first becomes a In expert The method: Teaching experiment. the users in the participating influencing tasks. without not their evaluator, is the performing to are who participants who user then the expert of question, behavior a an the ask at explain to would it needs evaluator direct the com- to if which has Therefore, normal, any he from results. because deviate test participant, or of the validity change with the to that interfere promises behavior suit user’s not better the to to cause is design might shadowing the workday interference adapt for his later rule throughout and main details, participant The in the behavior. behavior environ- follow natural user its would comprehend within to evaluator used order the being in that partici- is service means the or accompany This product observer the ment. the how have examine to to order is in shadowing pant might behind that idea The GUI test. the shadowing: aloud Job of thinking aspects simple a about the during particular how user about in the data and by gather users, neglected to be is by purpose interpreted The being product tasks. 
is the their about design performing questions are some they proto- participants time aloud the the thinking ask during the frequently to would evaluator extension the an as considered col, be may This [80]. protocol: there asking out Question methods evaluation Therefore, popular cheaply. cogni- most rather users development the the of among about phase is insight any technique valuable during this provides done This be including can tasks, feel. and performing and processes, are do, tive they think, as Participants at, mind design. look their a continuously they to about what while comes users whatever of system say opinions to the true ask encouraged the evaluators using are knowing usability enables tasks the It representative test, loud. perform out aloud thinking thinking to a participants in test implies, name the the As aloud: Thinking to attribute quality important this of delivery the end-users. ensuring for means the provide evaluations Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. oml o acltn h ecnaeo sblt susta a eietfidi usability a derived in they identified that [84], representing be function Nielsen can mathematical [84]: The Jakob following that the participating. and issues is users relation usability Landauer of number Tom of the by percentage on depending the study test, calculating a for in formula that, a note to important It’s N x where: uoaino ersi-ae sblt npcin4 128 / 42 Inspection Usability Heuristic-based of Automation Evaluation Usability 3.3. Testing Software of Fundamentals 3. Chapter steuesnumber. users the is eesTettluaiiyise number. issues usability total The refers • • • etn sa prpit ouinfrcmaiswt iie ieo ugt swl swhen as well [83]. as locate budget, to or hard time are limited with participants usability companies willing Remote for counterparts. 
solution appropriate moderated an their unmod- is than Consequently, testing verbalisations questions. less real-time contain answer labo- tests a or erated of tasks ask that complete to to participant having similar the without manner test, a remote independently unmoderated in an questions and in ask However, simultaneously or environment. communicate online ratory to present moder- them are a for participant in possible the testing. is details, and usability it further evaluator remote in the explain for To testing, support remote unmoderated. with environ- or ated tool natural moderated a be a or can in software technique behavior sharing This user screen of a observation using the by allows ment method This validate testing: cross other Remote and with behavior combined user be of can understanding [82]. analysis better results file a evaluation Log produce pro- to desktop release designs. methods product than future after evaluation applications usability superior evaluate from web to visualize differs is for help method path easier this and navigation of is user’s purpose the data the part naturally, if of which and or as grams, sort user, such this the problems Gathering uncover by to ignored expectations. analysed repeatedly be get can GUI are data the errors This and of paths, files. navigation log clicks, as as examination recorded such the all information through instance, behavior For user logs. analysing interaction involves of technique This analysis: file Log qualita- method. to aloud compared thinking high cost the relatively the as is However, such evaluations studies methods. these tive formal performing less for us- through required identifying discovered effort for be also and not and might the met, that extension, been different issues have by of requirements ability testing also certain comparative and that for user, validating useful for the very designs, influence are partici- not measurements the Performance to with order results. 
interacting test in avoid The experiments completely the or measurements. during minimize collect pants to be accurately advised to to are strongly environment tests is The laboratory evaluator 2.2.3). a section in in formally discussed (as conducted as- metrics on usability based the predefined data of quantitative about valuable sessments provides data technique more This capturing measurement: Performance to leads which tests, far aloud speak [81]. usually thought thinking design participants cognitive the the his during method, uncover teaching than to the more With order strategies. in the search encourage verbally or to himself processes is express technique become this to not behind participant purpose and true participation teaching the his because limit solver, to problem has active user an novice the Naturally, user. novice another f ( x = ) N (1 − (1 − λ ) x ) Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin4 128 / 43 Inspection Usability Heuristic-based of Automation study, particular this In λ 3.1: Figure Evaluation Usability 3.3. Testing Software of Fundamentals 3. Chapter vr h vlaoscnrl niqiymtosi h al tgso eint netgt and investigate to design of following: stages the early are methods the these in the of methods Some after inquiry needs. performed on More- user be assess releases. rely can future can of they feed- evaluators decisions or design their the guides testing from over, that usability learn data supplementary to with gather order combined to in release be GUI product can the methods of users aspects These asking various by about back. information opinions usability-related and subjective views their collects about methods usability of group This Method Inquiry [85]. The therefore uncovered It’s been have system. 
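The Nielsen-Landauer prediction above can be evaluated directly. The following sketch assumes the λ = .31 estimate discussed in this section and normalises N to 100, so that f(x) reads as a percentage of discovered usability issues:

```python
# Sketch of the Nielsen-Landauer prediction f(x) = N(1 - (1 - lambda)^x).
# lambda = 0.31 is the mean estimate cited in the text; N is normalised
# to 100 so the result reads directly as a percentage of issues found.

def predicted_issues_found(x, lam=0.31, n=100.0):
    """Predicted share of usability issues uncovered by x test users."""
    return n * (1.0 - (1.0 - lam) ** x)

for users in (1, 5, 15):
    print(f"{users:2d} users -> {predicted_issues_found(users):.1f}% of issues")
# -> about 31.0%, 84.4%, and 99.6% respectively
```

With λ = .31, the curve reproduces the figures quoted below: roughly 85% of the problems with 5 participants, and close to 100% with 15.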
λ is the predicted probability of finding a usability issue while testing with an average single user.
(1 − λ) is the predicted probability of a usability issue remaining undiscovered during a test, given that it has not been found yet in the previous tests.

In this particular study, λ is estimated to be ".31", and was deduced by calculating the mean value of previous predictions from 11 past studies, where the number of participating evaluators ranged between 11 and 77.

Naturally, the number of users cannot be negative, therefore it's enough to graphically represent the positive input of the function. This results in the curve shown in figure 3.1. The x-axis of the graph represents the number of participants in the usability tests, while the y-axis represents the percentage of found usability issues. Examining that curve, it can be derived that testing with just 5 users would uncover 85% of the usability problems, while 15 participants can yield 100% of the issues. Consequently, if a company has the budget to perform usability tests with 15 users, it would be preferable to perform 3 experiments with 5 target users in a context of highly iterative design, as this would return the biggest investment. An exception is when the target users are divided into at least 3 highly distinct groups; then each group should be represented in the tests [85]. It should also be noted that the usability engineers in this experiment had control over the total number of usability problems, which is needed to claim with certainty that 100% of the issues of the system have been uncovered; outside of such a setting, discovering 100% of the usability problems is challenging to prove [85].

Figure 3.1: Curve showing the percentage of found usability issues in relation with the number of participants (see [85])

The Inquiry Method

This group of methods collects subjective usability-related information about various aspects of the GUI by asking users about their views and opinions. These methods can be combined with usability testing in order to gather supplementary data, or performed after the product release to learn from the users' feedback that guides the design decisions of future releases. Moreover, the evaluators can rely on inquiry methods in the early stages of design to investigate and assess user needs. Some of these methods are the following:

• Individual interviews: The evaluator asks a user a series of questions about the system and its usability issues. The questions of the interview can be prepared in advance and be either structured or unstructured. In a structured interview, each user is asked the same questions in the same order. This guarantees that the collected information can be correctly aggregated, and also enables the comparison of interview results within various user groups or between different time periods. However, an unstructured interview is not composed of standardized questions. As this method is less formal, the relationship between the evaluator and the user is more balanced, which leads to a more natural conversation, and by extension, the collected data suffers less from social desirability bias [86]. Finally, a semi-structured interview combines prepared questions with some open questions, so the conversation can be guided toward a particular issue to be explored further if needed.

• Contextual inquiry: This is a semi-structured interviewing method. First, the interviewee is observed in his regular environment; then he's asked some standardized questions, aiming to obtain data about the context of use. Therefore, the collected data is more accurate and more realistic than data collected in a usability laboratory, and the asked questions arise from his realistic behavior. The use of contextual inquiry is appropriate when defining usability requirements and when it's conducted early on in development. Moreover, it's appropriate for improving work practices in unusual technical, physical, or social environments.

• Field observation: The evaluators would observe the participating users as they're working on a system in their natural environment. This method is less structured and less formal than a contextual inquiry, and the evaluators may also ask the users questions. However, contrary to a contextual inquiry, field observation is usually orchestrated after a product release. In other words, most of the collected information would help improve a future release of the product, instead of creating one from scratch. The observation can occur directly, as the evaluators are present on site, or indirectly by video recording the users performing their tasks. Since it can last for a year or more, it's usually intended as a long term research study [87].

• Focus groups: This is an informal method for understanding people's feelings and opinions about a certain design. Generally, the evaluator would moderate a two-hour meeting having between six to nine users to discuss some aspects of a GUI. The proper purpose of focus groups is not the identification of usability issues, but rather the assessment of user needs and preferences. Therefore, this technique provides a quick source of information for researching user assumptions, but it should not be the only source of data for the usability assessment of a design [88].

• Surveys: This method is suited for collecting quantitative data about a product or service (including its GUI) and user preferences regarding some system features. The main purpose of surveys is for the user to rate the user experience and present opinions about the design. The questions of a survey (and their possible answers) should be developed by the evaluator under careful considerations, in order to produce results that are relevant for improving the design. Traditionally, survey forms are mailed to users by post, but they are gradually getting replaced by internet surveys. Compared to other methods, surveys efficiently gather information from a sizeable number of end-users.

• User feedback: This method provides users the opportunity to directly send feedback about a product or service. Possibilities for collecting user feedback are often introduced as a link in the GUI itself, or in form of text-fields and a submit-button. The user is free to choose whether or not to participate, and the gathered information might be surprisingly useful, as the user might voice any frustration he's experiencing, as well as generate new design ideas. Collecting the feedback can also be delegated to a third party specialized in this service.
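Because a structured interview (or survey) asks every participant the same questions in the same order, the answers can be aggregated and compared across user groups, as noted above. A minimal sketch of such an aggregation; the groups, questions, and 1-5 ratings are invented illustration data:

```python
# Aggregating structured-interview (or survey) answers per user group.
# Groups, questions, and the 1-5 ratings below are hypothetical data.
from statistics import mean

responses = {
    "novice users": {"ease of navigation": [2, 3, 2], "error messages": [1, 2, 2]},
    "expert users": {"ease of navigation": [4, 4, 5], "error messages": [3, 4, 3]},
}

for group, answers in responses.items():
    # Same question set everywhere, so group means are directly comparable.
    summary = {question: round(mean(ratings), 2) for question, ratings in answers.items()}
    print(group, summary)
```

The same structure also supports comparisons between different time periods, by keeping one response table per evaluation round.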
Generally, inquiry methods should not be the only sources of data for usability assessment and improvement, because users do not always comprehend what they do, what they want, and what their needs are [89]. In order to design a better GUI, it's therefore important to focus more on studies of user performance rather than studies about user opinions, as what users say is less reliable than their real performance on a system [90].

The Inspection Method

Usability inspections refer to a group of methods where an evaluator uses a group of standards, guidelines, rules of thumb, or heuristics in order to identify possible usability problems in a GUI. Contrary to usability testing, these methods do not require the presence of actual end-users, which makes them considerably cheaper to perform. Usability inspections can be carried out early in development to verify the quality of produced prototypes, or later in development to assess the overall usability of a system. Among usability inspection methods are the following:

• Guideline review: In this method, one or more usability experts would check the conformance of a GUI with a list of accepted guidelines. As such, it's possible to rely on the many guidelines that were developed over the years. For applications that rely on Windows, Icons, Menus, and Pointers (WIMP), there's the Smith and Mosier guidelines (that include close to 1000 rules) in the case of desktop applications in general [91], or the Microsoft guidelines for Windows-based applications in particular [26]. Mobile applications as well as online applications have several different guidelines, mainly human interface and user experience guidelines proposed by manufacturers; some instances are the Android material design guidelines [94] and the iOS guidelines [93] for Mobile UI quality. When it comes to reviewing web applications, several general references exist. They deal with different contexts of use and might contain significant differences. Some sources for performing a web guideline review are Nielsen's 113 guidelines for homepage usability, and the ISO 9241-151 guideline, which contains guidance on a wide standard level for web user interfaces [95]. By contrast, other guidelines might focus more on context-specific attributes, such as privacy in the case of e-commerce websites, and trustworthiness or quality of service in the case of medical and health online services.

• Cognitive walkthrough: In this method, the usability expert simulates the problem solving process of a user while performing tasks himself. This can be applied to a GUI, a prototype, or even a paper mock-up. At each step of a task, the evaluator has to examine, judge, and document how well (or how bad) the GUI guides the user toward the next correct step, and how well each step of each task can be completed. Cognitive walkthrough has evolved over time into many variations of the main technique and extensions; however, when preparing for the evaluation, some methodological weaknesses remain, such as difficulties in proper task analysis, training requirements, and context considerations [96].

• Heuristic evaluation: This involves having one or more evaluators independently examine a GUI and check it against an accepted set of usability principles. Heuristic evaluation helps in identifying usability issues cheaply and quickly compared to other more formal evaluation methods. Figure 3.2 represents the analysis of a heuristic evaluation case study, in which 19 evaluators identified 16 usability problems of a banking system with varying detection difficulty. A black square in the figure represents an issue discovered by an evaluator. Judging from the visual distribution of those squares, it would not be safe to assume that relying on one evaluator is enough: even the most skilled evaluator would neither necessarily identify all the usability problems, nor uncover the issues that are the most difficult to find. Therefore, it's preferable to have multiple evaluators conduct heuristic evaluations to identify the highest number of violations [97]. Typically, the results of the evaluations are aggregated, and each problem is attributed a severity rating by an evaluator. This provides a cheap way of quantitative measurement of usability, and delivers valuable insight that assists designers and developers in prioritising design revisions or new features [98]. This is a valuable advantage in iterative development, because major usability problems have a higher discovery probability than minor issues in heuristic evaluations, and would thus be fixed earlier in the designs. Consequently, using about three to five evaluators probably delivers the most beneficial outcome, because relying on only one evaluator would not uncover all the issues, while using more than five evaluators would cost too much in relation to the return on investment; figure 3.3 holds a graphical curve that shows how the return on investment performs in relation to the number of evaluators [97]. It's also important to note that the number of principles examined in a heuristic evaluation is significantly less than the number of principles checked in a guideline review, because redundant and rarely applied heuristics are eliminated.

Figure 3.2: Representation of usability problems discovered by evaluators in a heuristic evaluation case study (see [97])

• Feature inspection: In the first step of this method, a series of tasks are identified, including the features needed for performing each task in a general context. Then, in a second step, each feature is evaluated based on its availability for the user, its understandability, and its usefulness. This technique also checks whether a feature sequence is too long or too complicated, or if one step in the process is unnatural. Therefore, in a general sense, the approach focuses on the importance of the features needed in the most critical tasks, and emphasizes on properly prioritizing them in the inspection, for an overall improvement of usability [99].

• Consistency inspection: Through this method, evaluators evaluate the consistency of the design within a single screen or between multiple screens of a GUI. The consistency tests examine the aspects of the text (such as the spelling and the font), the graphics (such as the icons, color, and layout), and the aspects of the interaction (such as command names, or the order of steps in a task). Naturally, this inspection should not be exaggerated, as the multiple software products of a company need to be distinguishable; thus, this approach ensures that a company would have enough consistency between its different products [100].

In a study by Tasha Hollingsed and David Novick [101], they found that heuristic evaluations are widely used in practice long after their inception. Furthermore, half of the top ten winners of the best intranets in 2005 used heuristic evaluation, while in 2001 and 2002, only 10% of them used this method [102].
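The aggregation step of a heuristic evaluation described above can be sketched as a small matrix computation, in the spirit of figure 3.2: rows are evaluators, columns are problems, and a 1 marks a problem found by that evaluator. The matrix and the severity scores here are invented illustration data:

```python
# Aggregating heuristic-evaluation results (hypothetical data):
# found[i][p] == 1 means evaluator i discovered problem p.
found = [
    [1, 0, 1, 0, 0],   # evaluator A
    [1, 1, 0, 0, 0],   # evaluator B
    [0, 1, 1, 1, 0],   # evaluator C
]
severity = [3, 2, 4, 1, 5]  # per-problem rating, 1 (cosmetic) .. 5 (severe)

n_problems = len(found[0])
# Union of everything any evaluator discovered (problem 4 stays hidden).
discovered = [p for p in range(n_problems) if any(row[p] for row in found)]
print("union of discovered problems:", discovered)

# Prioritise fixes by severity, as the aggregated reports suggest.
for p in sorted(discovered, key=lambda p: -severity[p]):
    hits = sum(row[p] for row in found)
    print(f"problem {p}: severity {severity[p]}, found by {hits}/{len(found)} evaluators")
```

Note how the union over three evaluators covers more problems than any single row, which is the argument for using several evaluators rather than one.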
Continuing with time, the formal usability inspection and the pluralistic usability walkthrough are being used less and less, with the usability inspection having already experienced the peak of its usage in the mid-1990s [101]. The cognitive walkthrough is frequently combined with other inspection methods, with usability testing methods [103], and with inquiry methods [104]. On a more contemporary note, the winners of the best intranet process of 2016 and 2017 focused on methods such as interviews, concept reviews, style and guidelines reviews, and popular usability inspection methods.

Figure 3.3: Curve showing the return on investment in relation to the number of evaluators in a heuristic evaluation (see [97])

The Analytical Modeling Method

Methods of analytical modeling are based on a representational model. An evaluator would be able to effortlessly generate user predictions of the system under test through the use of the interface model. In this category of usability evaluation methods, evaluators would, for instance, instead of manually writing hundreds of test cases, write an abstract model predicting the user behavior just once, and generate test suites by simply selecting different test criteria. What's more, it's possible to assess the usability of a variety of task environments from the same model. The models can focus not only on the user's required knowledge, or on representing the user's performance, but also on analysing the GUI [105]. Some techniques for usability evaluation through analytical modeling are the following:

• Goals, Operators, Methods, and Selections (GOMS) analysis: A GOMS model is a specialized representation of the human information processor model, derived from research on human-computer interaction. This psychological model describes the user's cognitive structure through his perception, his short term and long term memory, and his information processing. It also introduces a group of goals, operators, methods for achieving the goals, and selection criteria for deciding which method is better suited for meeting each goal, while the interface may support multiple methods all leading to the same goal. Each method is composed of a group of steps or operators. GOMS analysis can detect potential usability problems by predicting user behavior: an evaluator can rely on a GOMS model to generate predictions about possible usability issues. However, the model cannot identify usability issues other than the ones related to the user behavior in the model. In other words, because this method takes a goal-oriented view, it cannot identify issues related to aesthetic or linguistic aspects. Also, to determine all the required goals, operators, methods, and selection rules, it is required to analyse tasks in details [71].

• Cognitive task analysis: This type of task analysis has the goal of understanding and modelizing the cognitive activities needed from the user to perform complex tasks: decision-making, problem-solving, memory, and judgement. Cognitive task analysis methods can help evaluators in perceiving the differences between new and expert end-users, in analysing the information requirements of a complex system, or in assessing the mental workload associated with the GUI of a complex system. Some instances of the methods are the Critical Decision Method (CDM), the Applied Cognitive Task Analysis (ACTA), the Cognitive Function Model (CFM), and the Task-Knowledge Structures (TKS) [106].

• Task-environment analysis: This method is applied for examining the relationship between the tasks as originally completed by the user (in his environment) and how the same tasks are mapped in the GUI. Analysing these associations provides information about the learnability, the consistency, and how these represent the functionality of the GUI. An example of a task-environment analysis method is the External Internal Task Mapping (ETIT), which can be used for assessing the extent of knowledge transfer between multiple designs.

The Simulation Method

The simulation method relies on user and interface models to simulate some user interaction with the GUI. Then, based on this interaction, some usability issues might be reported. In these methods, an evaluator would make use of models of users and interfaces, and the simulator reports back its findings. Simulation techniques can be combined with usability testing evaluations. Simulators can be run under various parameters aiming to imitate a user interacting with sorts of interfaces, in order to examine different design directions. Some concrete simulation methods are the following:

• Information processing modeling: This model is based on research in psychology regarding cognitive development, and classifies how information is gathered, manipulated, stored, and retrieved by a human. Information processing models can be used to mimic human behavior when interacting with a GUI.

• Information scent modeling: By better comprehending how users seek information, it's possible to improve the usability of a GUI. For instance, humans are likely to estimate the actual value of the amount of useful information they get on a specific navigation path; then, they compare what they got with their expected outcome. This can strongly affect user engagement and trust [107]. Information scent models aim to mimic user navigation and discover usability issues.

• Petri net modeling: This method can be used for user performance modeling. When based on previous usage data, it can simulate the future interaction between a user and a GUI, and assist in detecting potential design problems.
valuable a provide with interaction methods user through garding evaluations simulation Usability or evaluation: manual modeling for analytical complex too are fix. that to tasks issues Performing usability results, which validation of cross prioritization the the Examining regarding decisions validated. simplifies re- cross evaluation effectively also a the and alongside a then cheaply developed, tests on be sufficiently these can relying is sults performing GUI by the then Therefore, when stage, test early usability resources. non-automated an and in budget evaluation additional development automated introduces the repeatable this drain However, might other. and mul- each combine costs complement to that better it’s methods Consequently, evaluation usability. improve tiple to guarantee needed always data cannot the evaluation of all type acquiring single one of Performing them. set expert validation: prevent fixed cross even domain a Simplify or against a on GUI early of a issues knowledge checking detect automatically the to By possible by it’s inspection). conditions, or standards walkthrough), a the in cognitive of as experience a (such and in skills the as by (such influenced evaluator strongly usability are Some instance, knowledge: for specialized methods, and inspection expertise human on dependency the Decreasing inter- structured a conducting directly and of before instead surveys test of view. usability inclusion remote the automated unmoderated automating instance, an for or be, after logging, can manual activities of Such the instead reduces cost. logging activities overall event of the types extension these by Automating por- and simple. time, considerable and work a repeatable methods, is evaluation process the usability of most tion For cost: evaluation the in Decreasing on early start after and implemented. 
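As an illustration of how analytical models yield quantitative predictions, the keystroke-level variant of the GOMS family (KLM) can be sketched in a few lines. This is a minimal sketch, not any tool's actual implementation: the operator durations are commonly cited averages in seconds, and the function name is ours.

```python
# Illustrative Keystroke-Level Model (KLM) estimate, the simplest member of
# the GOMS family. Operator durations are commonly cited averages (seconds);
# real studies calibrate them for the observed user population.
KLM_SECONDS = {
    "K": 0.28,  # press a key (average typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def predict_task_time(encoded_task: str) -> float:
    """Sum the operator durations of a task encoded as a string like 'MPBK'."""
    return sum(KLM_SECONDS[op] for op in encoded_task)

# Example: prepare mentally, point at a field, click, then type five characters.
print(round(predict_task_time("MPB" + "K" * 5), 2))  # -> 3.95
```

Comparing two candidate designs then amounts to encoding the same task for each design and comparing the predicted times, which is what makes this family of models attractive for automation.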
occur been process only have design measurements GUI process, iterative the performance comparison the non-automated the in comparison, greatly integrated By simplifies be development. effortlessly This can be ana- this can needed. automated since as predictions be especially designs usability would many models, this as on to for relying solution effort, generated By A time, methods. 2.2.3). in section modeling cost in lytical high metrics. discussed a same (as the demands resources using formally accu- and and performance an conditions same usability requires the designs measuring under However, two alternatives Comparing both of designs: assessment comparing rate of process the Simplifying Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. hrfr,anwpsil aeoyi rpsdi hsppr n ti ecie ntefollowing: the in described is it and paper, this taxonomy. previous in by proposed covered not is was category that possible aspect automation new new but a a reached research, Therefore, it previous where for point a abstraction to enough progressed provided has have might classes four These and evaluation [79]: usability categories between four compatibility into the methods arranges classifying automation, for used taxonomy the Generally, Evaluation Usability Automating Toward 3.3.5 to important it’s Therefore, automation. the degree. of reduce compatibility concept their to understand the evaluation is and usability to examine goal distinct differently Its the themselves However, partial evaluator. through lend process. him evaluation the methods supporting the by or evaluator of user the computerization on the complete burden or the replace lessen to to and not cost, evaluation is overall automation of purpose The uoaino ersi-ae sblt npcin5 128 / 50 Inspection Usability Heuristic-based of Automation Evaluation Usability 3.3. 
Testing Software of Fundamentals 3. Chapter • • • • • • rmteeautr hs tcssmr h ogri at.Atmto a oeie be sometimes can Automation effort lasts. and it phase. time longer this the demands during the on evaluator planning more the view of assist costs minimalistic to sort it used a this thus, with Consequently, evaluator, test the tasks. pilot from and Some- a survey. users incorporate a of in might include number phase to would questions planning that which the the participants choosing even the describing times, or selecting from group, to user evaluation, range a an can represent during They best users by the requirements. performed be on project to depending specified tasks differ the some can and make activities used to These needs method activities. evaluator various the performing evaluation, by an preparations performing Before planning: Automated challenging most the is and analysis, issues automatic usability achieve. successful the to a to implies improvements process or This solutions detected. suggests methods This and detect critic: complexity. automatically Automated in to capture able automatic exceeds are it that Naturally, methods issues. to usability refers locate class This completion. analysis: task Automated for needed time presses, tools keyboard the session be and software can interaction paths, information navigation on captured an movements, of depend Instances with mouse system. can associated the and data that user usability-relevant the methods between recording to and refers capturing category for This interaction capture: be the Automated logs can tool activity software a the if interference. in evaluator’s case make the the operation not without be does no details would computer it as goes However, a long same automated. on the process observations as evaluation the allowed, user logging is instance, the devices For recording assist video computerised. that or technologies audio of other use for supervised The while manually experts. 
performed be human should by that methods to refers This Non-automatic: budget the draining without checked be can time. iteration wasting each itera- or of support usability would the evaluations since automated design, of tive forms effortlessly some tasks on relying makes Therefore, automation Naturally, repeatable. design: iterative simplifying and Supporting Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin5 128 / 51 Inspection Usability Heuristic-based of Automation necessar- not does be approach can metric-based analysis The the pattern-matching. instance, on For or files. tasks, log on the metrics, analysing on by based performed be to can needed analysis interpretation. is Automated and this measurements on state user research mental attract more of that However, accuracy elements [110]. the design engagement increase the emotional understanding his in im- trigger help the or can analysing attention points, and meth- focus response other visual emotional by user’s his researched the of emotion be measuring plication cannot and automatically that software words, manner tracking other a eye In evalua- in on ods. an measured relying be of while can capabilities user usability the the APIs, exceed recording recognition that video results by collect instance, can For the capture tor. to automated according the advance cases, tasks. in some designated arranged In its and and interaction prepared recognition, the is speech of classes nature and the observation and datalogging notes, played. 
Separating planning, data capture, data analysis, and improvement suggestions allows evaluation efforts to be organized and concentrated on one aspect at a time, and usability goals to be achieved in less time, with a better accuracy and a reduced difficulty. It's also important to note that the feasibility and complexity of automation might vary depending on the type of the UI or GUI being evaluated. For simplicity, the following discussion focuses on WIMP, web, and mobile interfaces, as they have the longest history in research and practice.

Automating the Testing Method

One role in every testing method cannot be automated, and it is the role of the user: because the involvement of the end-user is the defining characteristic of this type of evaluation, replacing the user with a computer program that mimics the end-user would change the evaluation from the "testing" class into the "simulation" class (as described in section 3.3.3). Instead, the automation focuses on the relevant information produced by the interaction between the user and the system, and on the responsibilities and activities delegated to the evaluators (both can be automatically captured and analysed to a certain extent).

Starting with automated capture, its most suitable use is for remote testing and performance measurement. Instead of having the evaluator directly observing the participant's behavior, or repeatedly taking notes while watching a video of the participant's interaction with the GUI, it's more efficient to use a usability software tool that can automatically log the user activity. Such a recording tool can easily record relevant information such as mouse movements, key presses, navigational paths, errors made, and the time needed by the participant for each task. This logged data is used as a basis for identifying usability issues. Due to the voluminous nature of such log files, some tools can filter the logged data and automatically classify it according to its relevance to usability; for instance, the distance a user's mouse has moved and the number of error messages prompted during a task are relevant to capture.

Automated capture in remote testing can gather more important information than what is tested in a laboratory. Some online services specialize in unmoderated remote user testing of websites and mobile applications with a much larger user sample, such as "UserTesting" [109], which so far delivered over one million test videos to clients. Participants are first instructed to perform the tasks following the thinking aloud method, and their videos are examined before being delivered to the clients in a very helpful fashion, with relevant notes and additional feedback, similar to the screenshot presented in figure 3.4. The figure shows how observations are mapped to their corresponding timestamps in the video, while the area on the right side presents detailed notes linked to the recorded video. Even though there are no hints about how far the process of producing such advanced forms of detailed feedback is automated, a large range of the capture possibly is.

Figure 3.4: Partial snapshot of a sample user test from "UserTesting" delivered with attached notes and relevant observations (see [109])

Moving to automated analysis, the metric-based approach does not necessarily rely on the same metrics discussed in section 2.2.3. Automation can be used to statistically measure and aggregate completion rates, completion times, the number of physical operations required to perform a certain task, the number of errors made, and the variations in event call frequencies during the test. Based on such data, the analysis tools might attract the evaluator's attention to different GUI areas and analyse the quality of the usability, because tasks with a high cognitive or behavioral complexity associate with a low usability: a higher complexity indicates the presence of usability issues, since a high interaction difficulty hinders the problem solving process and decreases learnability. For instance, the Automatic Mental Model Evaluator (AMME) transforms recorded log files produced from user interactions into a problem solving model, which is then further analysed: the program automatically measures the cognitive complexity, the behavioral complexity, and the perceived task complexity [111] [112].

The task-based approach compares the user's normal interactive experience with the events expected from the designer's perspective on task completion. In other words, the log files produced when the usability test is performed are examined and compared with the log files the designer produced on the GUI. Through this analysis, software tools can pinpoint the activities and incorrect behaviors that suffer from usability problems [79].

The pattern-matching approach automatically analyses the captured logs to detect inefficient or repetitive user behavior, such as the user consecutively calling the same commands, or being continuously confronted with the same error. These patterns are generally the result of usability problems, and therefore, pattern-matching programs can discover usability issues in the logs generated from the participants [79]. If the analysis takes these task and usability patterns further, the located issues could then be arranged in classes and mapped to possible usability improvements. For example, if users repeatedly make a mistake while they submit a form, and the same unhelpful error message is prompted each time, a possible usability improvement would be to make the error message text accurate and concise, and then visually highlight the specific fields causing the errors (provided, of course, that it is feasible). Consequently, an accurate automated critique based only on the log files is possible, but more challenging.

An additional concept whose implementation might automatically improve the automation in usability testing is the use of machine learning. Machine learning focuses on studying algorithms that progressively improve in solving a task through experience [113]. It has great potential for further improving automated analysis and automated critique. By constructing models relying on machine learning, the performance of various evaluation systems can be measured to find the most suitable one for a specific context of use (such as eCommerce sites) [114]. Machine learning techniques can also be suitable for locating issues related to the organization of and navigation between web pages, as shown for eLearning systems by Christoffer Korvald and Eunjin Kim [115]. Moreover, it's even possible to use machine learning systems for prioritizing possible usability improvements by analysing and ranking which determinative usability factors are most needed in relation to the overall usability of a web-based design, despite the training set being composed of semi-random usability data [116].

Automating the Inquiry Method

The automation of this type of usability evaluations is possible through the use of embedded questions that the user would be prompted to answer (which supports automated capture by default). Naturally, the gathered data about user preferences and opinions carries subjectivity, and the main goal of automated inquiry evaluation would then be to collect data about usability problems. Basically, automated capture in inquiry methods can rapidly collect and categorize information from a large number of people. However, the value and the quality of the gathered data depend on a great deal of effort put by the evaluators in the design of the questions and in the planning phase. Some of the problems resulting from such bad planning and preparation are the following [8]:

• Having unneeded questions that do not serve the purpose of improving future designs.
• Having inadequate answer options that do not help in classifying or quantifying information.
• Relying on participants from an incorrect user group or demographic.
• Inconsistent rating options between questions.
• Wrongly assuming some prior knowledge from participants.
• Making the survey too long.

Generally, surveys can be displayed within a GUI, and users can be prompted or suggested to participate after they finish some of their tasks. The answers are more accurate when the software occasionally asks users to participate immediately after they finish their main tasks, because the interactive experience would still be clear in the mind of the users. The frequency at which users are asked should be as low as possible, because asking users multiple times to participate in an inquiry evaluation might irritate them, which decreases the usability. In order to avoid that problem, it's of course preferred to present the evaluation as a link (or a command) which users can choose, instead of a pop-up [79]. Then, the users who voluntarily participate would initially put more thought into their answers than those who might feel forced into participating. Additionally, using an incentive (or a prize) might significantly increase the motivation of users to participate, but a considerable portion of this motivation will not necessarily provide honest answers, and might corrupt the validity of the collected results. Another issue associated with monetary incentives is that websites which specialize in online surveys are constantly confronted with users who unethically rely on artificially intelligent internet bots to maximize their profits [79].

Compared to close-ended questions with a fixed set of answers, open questions are difficult to examine automatically. In other words, answers belonging to a limited set of simple choices can be automatically examined and aggregated, while it's difficult to automatically produce relevant statistics from different textual forms. Some software tools based on text mining, statistics, and computational linguistics try to solve this very problem and automatically classify users' replies to open questions [79]. This also benefits other written feedback inquiry methods, for instance by applying sentiment analysis to quantify the tone and meaning of the text.

Machine learning algorithms can also help in avoiding the planning issues listed above by laying the groundwork for the design of a questionnaire based on data from previously performed inquiry evaluations, which leads to a reduction in the cost of the evaluations and in the recruitment of unwanted participants [117]. With automated analysis, the gathered data can be used to quantify both user expectations, by asking users how difficult they expect a task to be, and the actual task difficulty, by asking them again immediately after they perform the task. It's then possible to automatically detect problems when the actual difficulty exceeds the user expectation. The same way can be applied to compare satisfaction levels, and this works better when one particular task is detected to be more difficult than other tasks, based on a large database of similar previous measurements.

Generally, automated inquiries and questionnaires seem very successful in data capture, and to a lesser degree even in automated preparation. However, it's more challenging to automatically analyse their data and locate usability issues, and even more challenging to suggest improvements. Therefore, automated inquiry methods suffer from the same limitations as their non-automated counterparts: the gathered information about user preferences and opinions is more suitable for identifying existing usability problems than for helping to improve usability [79].

Automating the Inspection Method

Some software tools tried assisting evaluators in automated capture by providing forms to fill in, and guiding them with questions that they answer throughout walkthrough sessions [118]. However, this approach proved itself costly in terms of time and energy, and evaluators progressively found it counter intuitive during cognitive walkthrough sessions. As an alternative, it's possible to minimize the effort put into the documentation and facilitate this function with speech recognition software and voice commands to record the observations, and with video logging tools [96]. To a certain degree, automated capture and automated analysis can be applied when performing guideline reviews, heuristic inspections, standard inspections, feature inspections, consistency inspections, and perspective-based evaluations (but there is no combined usability critic on the market yet). The feasibility of automation depends directly on the difficulty and complexity associated with checking for specific conditions. Two different examples can be considered. The first usability principle considered would be having the system respond quickly to the user's initiated commands or events, with a short response time (ideally equal to or less than one second) [119]; it is effective to automate checking compliance with this principle. On the other hand, the second principle considered would be having an aesthetically pleasant design. Checking compliance with this second principle is challenging to operationalize, as aesthetically evaluating a design is a subjective judgement. Some computational methods have been developed to aesthetically evaluate photographic images [120], but evaluating a GUI design is more complex, and more importantly, it is better suited for other usability evaluation types such as analytical modeling. From this background, it's possible to assume that, currently, the automation of guideline-based usability inspections varies in complexity depending on the nature of the principles that need to be checked. Therefore, the evaluator plays a role that cannot be replaced, and automating his work should only be done partially (because checking compliance with some particular principles manually is sometimes more accurate and more cost-effective).
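The pattern-matching analysis of captured logs described in this section can be sketched in a few lines: flag consecutive repetitions of the same event, and count recurring command-to-error pairs. The event names and the threshold are illustrative, not taken from any real tool.

```python
# Sketch of pattern-matching analysis over a captured event stream.
from collections import Counter
from itertools import groupby

def repeated_runs(events, min_len=3):
    """Consecutive repetitions of one event, e.g. the same command called again and again."""
    runs = []
    for event, group in groupby(events):
        length = len(list(group))
        if length >= min_len:
            runs.append((event, length))
    return runs

stream = ["open_form", "submit", "error:date", "submit", "error:date",
          "submit", "error:date", "save"]

# Strictly consecutive runs miss alternating patterns, so also count event pairs:
pair_counts = Counter(zip(stream, stream[1:]))

print(repeated_runs(["back", "back", "back", "save"]))  # -> [('back', 3)]
print(pair_counts[("submit", "error:date")])            # -> 3
```

A pair such as ("submit", "error:date") recurring three times is exactly the form-submission pattern discussed above, and would be the trigger for a suggested improvement to the error message.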
Many software tools and services support the evaluators in their inspections. The GUI elements of web applications can be automatically checked to examine visual aspects such as their symmetry and alignment, but the support for WIMP applications is relatively limited. On the other hand, web applications benefit from the most complete list of guidelines to check against. For instance, such tools can automatically check for many guideline violations: the "Cynthia Says" portal is a web content accessibility service that can check a website for compliance with accessibility guidelines [122], and the Varvy SEO tool checks for compliance with many Google guidelines [121]. To perform an inspection of a WIMP application's compliance with ergonomic rules on Windows platforms, a method called Ergoval can be used, structuring the rules according to user interface objects [123]. AIDE [124] and Sherlock [125] are two automated analysis tools for Windows interfaces that provided valuable support for evaluators, analysing the efficiency of a GUI visual layout and the consistency of the presented terminology. Nevertheless, these tools just cover aspects from the developer's view and do not suggest solutions, and they still leave significant room for improvements to bridge the gap between them.

Due to the large size of guidelines, automating the checking of guidelines in a guideline review is very beneficial [79]. One way of simplifying the process of automating the evaluation would be to arrange the guidelines and principles into groups such as task-related principles, screen-related principles, and widget-related principles. Despite the abstract level of such categorization, these exact classes showed great success when evaluating mobile health applications [126]. Task-related metrics include aspects such as the number of steps needed to complete a task and the number of scrollable inputs. Screen-related metrics deal with aspects such as the number of GUI elements on the screen and the average mouse distance travelled for performing a task. Widget-related metrics check, for instance, for a proper text color contrast ratio and for appropriate default values in non-redundant input fields.

Heuristics are a generic set of usability rules of thumb. Heuristic-based automated analysis has shown great success in past studies, as performed by Justin Mifsud and Alexiei Dingli [127]. One of these studies resulted in building a framework for the automated heuristic evaluation of websites, called "USEFul" [17]. Conducting experiments with this framework showed that it detects 51.48% of the usability violations identified manually by usability experts conducting the same heuristic evaluation, and the result increases to 95.86% when the framework checks the exact same number and type of heuristics. Besides, the tool is limited to inspecting the text and Cascading Style Sheet (CSS) files of a website [17].

Automating the Analytical Modeling Method

All analytical modeling methods inherently support automation, especially the methods explained in section 3.3.3, and automated analysis can be used on all types of WIMP applications. For instance, automated GOMS analysis can produce accurate and complex predictions that assist designers in improving the GUI [79]. Even psychological factors affecting usability in a human interaction with a GUI can be represented in a model: for example, through the use of a Cognitive-Affective Model of Perceived User Satisfaction (CAMPUS), the aesthetics of a GUI can be automatically analysed to effectively detect usability problems [128]. With this in mind, it's difficult to pinpoint the limits of the potential automation in analytical modeling. Usability evaluator tasks that currently cannot be automated may possibly become automatable in the future, when a model representing particular expertise or knowledge is developed. Models can collect and analyse large quantities of information and make decisions removed from common biases that might affect an evaluator's human judgement.
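One widget-related metric mentioned above, the text color contrast check, is commonly operationalized with the WCAG relative-luminance formula. The sketch below follows that published definition; the threshold shown is the WCAG AA minimum of 4.5:1 for normal text.

```python
# Widget-related metric sketch: text/background contrast ratio per the
# WCAG relative-luminance definition.
def _channel(c8):
    c = c8 / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # -> 21.0
# Medium gray text on white against the 4.5:1 AA threshold:
print(contrast_ratio((117, 117, 117), (255, 255, 255)) >= 4.5)  # -> True
```

A guideline-review tool would run such a check over every text widget it can extract from the GUI description, which is exactly the kind of condition that is cheap to automate, in contrast to aesthetic judgements.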
evaluation the larger direct much than as a accurate cheaper to as much be tributed inquiry not are and might but inspection, methods effort, testing, inquiry and as time Automated on such be saves classes and can evaluation costs the other benefit model on of cuts the time automation and hand, of other repeated period the effortlessly On long be a can over tests [130]. last the inexpensively adjusted as that continuously models, projects using work software initial starting from However, overall for the significantly skills increases which required development. high, and very of effort these is The cost categories, in evaluation approach time). two these testing much from manual too methods take a with and and evaluation (because difficult method automation, Some be concrete support would the inherently cases automate. matter simulation no to and automated process modeling be evaluation analytical should usability particular in which and determine classes to simple not It’s Process Automation new suitable users, Choosing human to enough close is emerge. AI draw might interface once engineering to an and difficult usability with needed on is interaction perspectives is it and their but area possibilities where potential this of point in lot a research a reach More have bots bots effectiveness. Intelli- AI their Artificial user. about of human use conclusions the the concrete is mimic upon, to expanded bots be to (AI) can dynamically gence still changed which method ad- be automation big can additional a variables An [129]. is its usability analysis that on because automated means impact decisions, This the which design effective observe customizable, increase to. and iteratively the logical is links many that tool it make means to UESim one arrow, needs the the negative that of a team design decrease hand, variable a the the one for On if vantage to that leads to. indicates variable links arrow one it positive of variable A are the relationships arrows. 
will the negative so where as diagram increases, or loop positive with causal associated as a concepts either in The However, represented displayed [129]. are 3.3.3). UESim UESim is section model evaluation in simulation usability discussed the a devel- are simulate software to which to model of related closest (some the issues solving different problem detect cognitive to and [79]. created process opment been solving problem have simu- the models a analyse GUI, simulation then construct Various a and to data be with usage would interacting previous example from when An directly behavior model analysis. lation user automated mimicking support inherently on methods based Evaluation these Usability are methods 3.3. simulation Since Method Simulation the cost Automating initial high relatively the elements. as graphical well the as or tasks parameters), different and the might variables modeling use on their for rich that are complexity and they difficulty (as the introduce are analysis, model to downsides some However, Testing Software of Fundamentals 3. Chapter Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin5 128 / 57 Inspection Usability Heuristic-based of Automation the with associated problems realistic a that with means task This assessed. the accurately manner. performing be spontaneous cannot user focus a a user’s by in encountered concentration However, For be of hand. environment. at only level laboratory task can the a to problems in team attention usability simulated full and their some be dedicate nature, cannot usually use observation in under of budget, users context example, in the differ of aspects all some they Finally, as perfect the project selecting certain of for overanalysis a way an for clear experience. avoid methods no there’s help evaluation However, would of budget. 
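The causal-loop mechanics behind a model such as UESim can be illustrated with a short sketch. The variable names and the link weights below are hypothetical illustrations, not UESim's actual model; the point is only the mechanism: signed links between concepts, updated iteratively, so that the effect of a design decision can be observed propagating through the loop.

```python
# Minimal causal-loop sketch (hypothetical variables, not UESim's real model).
# links: (source, target, sign) — a positive sign means an increase in the
# source pushes the target up; a negative sign pushes it down.
LINKS = [
    ("menu_depth", "clicks_per_task", +1),
    ("clicks_per_task", "task_time", +1),
    ("task_time", "satisfaction", -1),
]

def simulate(values, steps=3, rate=0.5):
    """Propagate changes along the signed links for a few iterations."""
    values = dict(values)
    for _ in range(steps):
        updated = dict(values)
        for src, dst, sign in LINKS:
            updated[dst] += sign * rate * values[src]
        values = updated
    return values

base = simulate({"menu_depth": 1.0, "clicks_per_task": 0.0,
                 "task_time": 0.0, "satisfaction": 0.0})
deeper = simulate({"menu_depth": 2.0, "clicks_per_task": 0.0,
                   "task_time": 0.0, "satisfaction": 0.0})
# A deeper menu hierarchy ends with lower simulated satisfaction than baseline.
print(deeper["satisfaction"] < base["satisfaction"])  # True
```

Changing `menu_depth` and re-running is exactly the kind of dynamic, iterative variable adjustment that makes this class of analysis customizable.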
Even though different evaluation methods produce different results, it's essential to first identify the percentage of usability issues that needs to be detected, then select an appropriate set of cost-effective evaluation methods that complement each other, especially when deciding and prioritizing design revisions and feature extensions. Diversity in the evaluation methods is also needed for accurate and in-depth results, as well as for the discovery of different issues and for rating their severity: one single evaluator gives different results than a group of evaluators, and different users find different issues. To explain this further, applying the same method to different user groups produces different results — user groups, of course, share undeniable similarities, but different contexts of use and cultures might still introduce more uncertainty and small inconsistencies between the results. Another issue is related to the need for cross validation: contrary to regular software testing, usability testing cannot be performed just once, and results from one evaluation method need to be validated by aggregating them with results from other methods.

Usability principles and guidelines (or heuristics) are important, but usability does not refer to the extent of compliance with all the guidelines, and blindly applying all guidelines will not automatically increase usability. Each rule can have some exceptions that need to be considered regardless of the context of the application. For instance, a major usability guideline known since the late nineties is that websites were to be designed in a way that minimized scrolling, because users do not like to scroll. However, exceptions do exist: an example would be Facebook, which is one of the most successful websites in history, despite having a design that requires excessive scrolling. Therefore, the nature and context of the design need to be considered in order to improve usability.
3.3.6 Limits of Usability Evaluation

Every engineering effort has its practical limits, and usability evaluation is no exception. Understanding what sort of issues usability evaluation does not focus on, or in which aspects it performs poorly, is important for comprehending how to properly conduct an evaluation. First of all, the results of usability evaluations should not be confused with market research. The inquiry methods, in particular, collect user opinions and preferences; market research, on the other hand, analyses the market in order to adjust the design and the business strategy to the market's needs, its size and the competition accordingly.

Finally, it should be recalled that designers incorrectly interpret common principles and guidelines. An automated inspection of an interface can detect problems, analyse them, interpret the guidelines, and even suggest improvements, which cuts the amount of required manual tests by 30% [123]. Therefore, the automation of usability inspections should always be considered, especially if a framework or a tool that supports the process in a significant way already exists.

4 Automating Heuristic-based Usability Inspection with GUI Event Sequencing

Studying the degree in which one particular field, such as GUI testing, can benefit from techniques designed for a completely different field, such as usability evaluation, requires to first thoroughly comprehend the extent at which these two fields are similar, and the extent of their differences. Moreover, it's also needed to understand which GUI testing methods have successfully been commercialized and assist the testers by tools, and to have an overview of the automation techniques and methods which the currently available usability tools support, as usability engineers aren't assisted by tools as much. Finally, the impact of the automation of certain GUI testing techniques on usability evaluation would be studied, in order to suggest possible improvements in the automation of usability evaluation.

4.1 Comparison between GUI Testing and Usability Evaluation

Even though GUI testing and usability evaluation had both been described in details in their respective sections 3.2 and 3.3, it's still important to examine their similarities and differences when compared with each other. It's also needed to have a basic understanding of the scopes of their methods and of their respective tools currently on the market. This would help for comprehending which methods are being favoured by the software industry, and which evaluation methods are being overlooked.

4.1.1 Similarities between GUI Testing and Usability Evaluation

The most apparent aspect shared between GUI testing and all forms of usability evaluation is their focus on the GUI. Both can be performed without any knowledge of the source code, and cannot be started if the GUI is not sufficiently developed. Furthermore, even though the objectives of GUI testing and usability evaluation methods are different, some of their methods share some resemblances. For instance, black-box GUI testing (which is discussed in section 3.1.4 and further in 3.1.5.1), where a portion of the test is performed with the GUI, overlaps with the testing method and the simulation method of usability evaluation (which are described in sections 3.3.3.1 and 3.3.3.5 respectively), as those methods are designed for tasks being performed in the GUI. In other words, GUI testing mainly checks if the user truly can access the functionalities specified in the requirements, while the usability methods assess the degree of ease of use concerning the graphical interface; both might check how the user performs the exact same tasks. This means that the same function is performed twice throughout the same GUI: the first test would check that said function performs correctly [65], while the latter would assess if it's easy to perform [8] (but for different reasons). Nevertheless, the same task is being performed.

For instance, a GUI test verifying the alignment of graphical components might seem similar to a design inspection checking the alignments. In a usability inspection, checking the alignment aims to verify the aesthetics and consistency of the design, and to ensure that the design guides the eyes of the users to relevant GUI areas during tasks; such a GUI test would instead measure how well the graphical alignment matches the specification, which also means that the alignment is aesthetically correct. This means that even if the same tests are being performed, the motivation for doing such tests is different, though it's probable that both would include a visual test with similar components. However, verifying the look and feel of the GUI is the purpose of the usability inspection and analytical modeling methods, and to a lesser degree of the inquiry methods (which were all described in section 3.3.3), even if all such visual GUI tests were performed [133].
4.1.2 Differences between GUI Testing and Usability Evaluation

Since usability is a non-functional quality attribute, the presence of uncovered usability problems does not indicate a deviation of the system's behaviour from what was agreed upon and specified in the requirements [8]. On the other hand, the main goal of GUI testing is to verify that the system's behaviour conforms to the specified requirements [65]. Unlike GUI testing, usability evaluation does not aim to find GUI faults: no matter the evaluation method used, the tasks performed by the users do not actively try to uncover defects or contribute to increasing the quantity of bugs found or the test coverage. The tests performed during a usability test are designed by the usability engineer and focus on finding usability problems; however, if a defect is discovered during a usability test, it could still be documented and reported.

Moreover, some GUI tests might focus on verifying the look and feel of the GUI across multiple supported environments (such as various operating systems, different browsers, or display settings), and whether the design is aesthetically pleasant. These tests check the appearances, the correctness of the design, the CSS, the layout positioning, the images, and so forth. Usability tests, on the other hand, validate efficiency, effectiveness and user satisfaction, which are also non-functional quality attributes (as was discussed in section 2.2.3). In conclusion, the nature of the tasks around which the tests are created is sometimes identical, but the motivations of GUI testing and usability evaluation still differ, as well as the significance of the test results.

It should also be reminded that checklist-based testing is not synonymous with design guideline verification (which means that its scope is dependent on the tasks specified in the checklist), and that checklist-based GUI testing can even work if testers manually check functional requirements. It's also significant to clarify that GUI testing might go beyond investigating functionality, into checking whether basic design principles such as controllability or error tolerance (as explained in section 3.2.1 and presented in table 2.1, among various other examples) are being respected; it's possible to further analyse the feasibility of checking such principles with checklist-based testing. The exploratory testing technique (described in section 3.1.3) is very similar in form to the usability inspection process (defined in 3.3.3): it also relies on the analysis of interfaces to determine the existence of a problem, but does not necessarily overlap with usability guidelines. By way of explanation, some software companies naturally have their own specified set of design guidelines to be consistently respected across all their software products, similar to what checklist-based and exploratory testing verify during usability inspections. However, some of these guidelines might contradict some user interface design principles, despite the possible differences [132].
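The distinction between a functional checklist and automated guideline verification becomes clearer when guidelines are encoded as data. The page model and the two rules below are hypothetical illustrations — real guideline sets are far richer — but the pattern is the same: each heuristic is a predicate evaluated against the interface.

```python
# Sketch: design guidelines encoded as predicates over a simple page model.
# Both the page model and the two rules are hypothetical illustrations.
page = {
    "images": [{"src": "logo.png", "alt": ""}],
    "font_sizes_px": [11, 14, 16],
}

GUIDELINES = {
    "images carry alternative text":
        lambda p: all(img["alt"] for img in p["images"]),
    "body text is at least 12 px":
        lambda p: min(p["font_sizes_px"]) >= 12,
}

# A functional checklist would run task steps; a guideline check instead
# evaluates every rule and reports which ones the design violates.
violations = [name for name, check in GUIDELINES.items() if not check(page)]
print(violations)
```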
Since some GUI tests verify design guidelines, a decrease in usability would make some tests fail — with the exception of aesthetic design problems, which functional tests do not cover. In a worst-case scenario, consider an extreme project where all GUI tests (also called functional correctness evaluations) pass, yet all the guidelines associated with the look and feel of the visual design were violated: a usability inspection would consider the design severely flawed, even though it's still possible that a user sample was able to finish every single task. By contrast, consider one last extreme project where, after a certain upgrade, all GUI tests failed disastrously and no user can perform any task in the GUI: a usability evaluation would detect a strong decrease in user-friendliness, since tasks can no longer be completed by users. To examine how the results of both fields correlate, further examples are necessary, but based on the two previously described ones it can already be concluded that an increase in the positive results of GUI testing does not necessarily lead to an increase in the levels of user-friendliness measured by usability evaluation, and vice versa.

On the other hand, a second contrasting example would be a usability test that discovers that every member in the user sample was able to easily change their password in a very short time, proving the task to be highly user-friendly. However, the very precise requirement might state that a password can only be edited after checking the user's authenticity (by sending a new password via email or to a phone number). In other words, GUI testing would uncover the failure to meet the requirement, since the task of verifying the user's authenticity was never performed, even though the password change itself was proven to be highly user-friendly. As a more extreme case, a script-based GUI test might theoretically require pressing an invisible button in order to successfully perform a task. A script-based test naturally accesses the button through the source code, so the test would always pass; the presence of an invisible button, however, is a severe usability issue, since the chance of a regular user performing the task is almost zero.

Furthermore, even in the presence of severe usability issues, the GUI might still pass its associated tests positively if the functions are working as expected and conform with the specified requirements. The same holds true the other way around: GUI tests would fail if one or more tasks do not conform with the specified requirements, even if the GUI is user-friendly [8].

One more difference lies in the amount of knowledge possessed by the actors performing the tests. In the case of usability evaluation, it's strongly encouraged that the users performing the usability tests do not have prior knowledge of or experience with the system, so that the measurements can be unbiased. On the other hand, when performing GUI testing, the tester has prior knowledge of the software no matter the testing method: even in the case of black-box testing (compared to white-box and gray-box testing), where the tester has the least amount of information about the system, the provided data still includes the list of specified requirements with details about the expected behaviour of the system.

An additional example is the measurement of the user satisfaction level through the usability inquiry methods. The satisfaction should not be confused with opinions about the visual design: it should be reminded that the inquiry methods measure how much users are satisfied with their experience concerning the difficulty of the performed tasks, and not whether they like or dislike the visual design. Nevertheless, users' opinions are prone to a bias: the aesthetic-usability effect, observed in several experiments, describes the degree to which users perceive more-aesthetic designs as user-friendlier than less-aesthetic designs, regardless of their actual usability [134].
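The invisible-button case can be sketched in a few lines. The widget model below is hypothetical; it only serves to show why a script-based test passes while a visibility check — the kind of check a usability inspection would make — flags the defect.

```python
# Sketch of the invisible-button case with a hypothetical widget model:
# the script-based test reaches the widget through the object tree and passes,
# while a visibility inspection reports the usability problem.
widgets = {
    "save_button": {"visible": False, "enabled": True},
}

def script_based_test(tree):
    """Passes: the script clicks the button through its internal handle."""
    button = tree["save_button"]   # located via the source, not via the eye
    return button["enabled"]       # the click succeeds, so the test is green

def visibility_inspection(tree):
    """Fails: no user could ever press a widget that is not rendered."""
    return [name for name, w in tree.items()
            if w["enabled"] and not w["visible"]]

print(script_based_test(widgets))      # True — the GUI test passes
print(visibility_inspection(widgets))  # ['save_button'] — usability defect
```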
All previous examples and reasoning discussed in this section are informal, since currently there's no practical study in this area examining how an increase or decrease in one variable would affect the other. Validating the relationship between the results of GUI testing and usability evaluation would require more than informal reasoning and some examples; therefore, there's a need for more research. Additionally, with these examples in mind, it could be noted that a decrease in the functional and graphical consistency might lead to inferior visual aesthetics accompanied by a decrease in user-friendliness, while having no effect on the number of passing GUI tests. The reasoning is further simplified in the following:

• A high number of passing GUI tests might be accompanied by both a high or a low level of user-friendliness.
• A superior level of usability might be associated with both a high or a low amount of passing GUI tests.
• A low number of passing GUI tests generally means fewer functionalities offered, which means fewer tasks can be performed by users during a usability evaluation, leading to a lower measured user-friendliness.
• A poor usability level does not affect functional GUI tests at all. However, if the decrease in usability results in inferior aesthetics and poor visual consistency, then it's most likely to be accompanied by a decrease in the passing GUI tests specializing in checking the look and feel of the design.

In conclusion, GUI tests mainly uncover defects related to the correctness of the software, while usability tests find usability issues. Secondly, the results of GUI tests share no positive and no negative correlation with the measured levels of usability. Furthermore, the amount of prior knowledge about the software differs greatly for both fields, and the user satisfaction is related to the aesthetics of the design only to some degree. Consequently, neither of both forms of testing can replace the other, but it's important to not neglect either of them.

4.1.3 Review of GUI Test Automation Tools

Table 4.1 lists some GUI test automation tools whose developing status is still active (as of December 2018), sorted in ascending order by name. The first column shows the name of the tool, and the second column demonstrates what kind of studio currently owns and still maintains the tool. Thirdly, the "Supported Platforms" column shows whether the tool supports the automated GUI testing of desktop, mobile and web applications. Lastly, the fourth column shows the type of licence of the tool, in order to keep the detailed table concise.

Additionally, in order to cover the most prominent tools, two independent analyst groups, Gartner Research [156] and Forrester Research [157], investigated in 2016 some of the most prominent functional test automation tools and evaluated them based on data such as customer interviews, expert interviews, user needs, vendor demonstrations and product demonstrations. The Gartner research required the tools to support testing Windows desktop applications as well as iOS and Android mobile applications [156]. The Forrester research, on the other hand, required those tools to be capable of testing web applications, in particular cross-browser testing and API testing, as well as mobile testing [157].

Table 4.1: Comparative list of active tools for automated GUI testing

Tool Name | Offered By | Supported Platforms | License
AscentialTest [135] | Zeenyx | Desktop: Windows; Web applications | Proprietary
AutoIt [136] | AutoIt Team | Desktop: Windows | Freeware
Dojo Objective Harness [137] | Dojo Foundation | Web applications | Academic Free License
EggPlant Functional [138] | TestPlant | Desktop: Windows, Linux and Mac; Mobile: Android and iOS; Web applications | Proprietary
iMacros [139] | iOpus | Web applications | Proprietary
Linux Desktop Testing Project [140] | (collaboration) | Desktop: Windows, Linux and Mac | GNU LGPL
Marveryx [141] | Marveryx srl | Desktop: Windows; Web applications | Proprietary
Oracle Application Testing Suite [142] | Oracle | Desktop: Oracle applications; Web applications | Proprietary
QF-Test [143] | Quality First Software | Desktop: Java applications; Mobile: Android | Proprietary
Ranorex [5] | Ranorex GmbH | Desktop: Windows; Mobile: Android and iOS; Web applications | Proprietary
Rational Functional Tester [144] | IBM Rational | Desktop: Java applications; Web applications | Proprietary
Sahi [145] | Tyto Software | Web applications | Apache
Selenium [4] | (collaboration) | Web applications | Apache
SilkTest [146] | Micro Focus | Web applications | Proprietary
SOAtest [147] | Parasoft | Desktop: Windows and Linux; Web applications | Proprietary
Squish GUI Tester [148] | froglogic GmbH | Desktop: Windows and Linux; Mobile: Android and iOS; Web applications | Proprietary
Test Studio [149] | Telerik | Desktop: Windows; Mobile: Android and iOS; Web applications | Proprietary
TestComplete [150] | SmartBear Software | Desktop: Windows; Mobile: Android and iOS; Web applications | Proprietary
Tosca [151] | Tricentis | Desktop: Windows; Mobile: Android and iOS; Web applications | Proprietary
Unified Functional Testing [152] | Micro Focus | Desktop: Windows; Web applications | Proprietary
Visual Studio Test Professional [153] | Microsoft | Desktop: Windows (in particular Visual Studio solutions); Web applications | Proprietary
Watir [154] | (collaboration) | Web applications | MIT license
Xnee [155] | GNU Project | Desktop: X Window Systems | GNU GPL
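Many of the tools in table 4.1 automate the capture and replay method described in section 3.1.5. Independently of any particular product, the idea can be sketched as follows; the form model and the event names are hypothetical stand-ins for a real GUI under test.

```python
# Tool-agnostic capture-and-replay sketch: interactions are recorded once as
# events, then replayed against the GUI and the outcome is reported.
class FormModel:
    """Hypothetical stand-in for a GUI under test."""
    def __init__(self):
        self.fields = {}
    def type_into(self, field, text):
        self.fields[field] = text
    def click(self, button):
        # Submitting succeeds only if the user field was filled in first.
        return button == "submit" and bool(self.fields.get("user"))

recording = []  # capture phase: every interaction is appended as an event

def capture(event):
    recording.append(event)

capture(("type_into", "user", "amir"))
capture(("click", "submit"))

def replay(gui, events):
    """Replay phase: feed the recorded events back, return the last outcome."""
    result = None
    for name, *args in events:
        result = getattr(gui, name)(*args)
    return result

print(replay(FormModel(), recording))  # True — the replayed run passes
```

Re-running the same recording against each new build is what makes the method attractive for regression testing, and also what makes recordings brittle when the GUI structure changes.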
Consequently, the efforts of both research groups yielded very similar results when deriving the leaders in the test automation field (including GUI test automation). The winning tools, which met the evaluation criteria and were determined by both groups to be the best in the industry, are: Unified Functional Testing [152] by Micro Focus, Rational Functional Tester [144] by IBM Rational, and Tosca [151] by Tricentis. The only exception is SOAtest [147] by Parasoft, which was determined to be an industry leader only by Forrester, as it was not among the leading tools selected by Gartner (since SOAtest focuses on Web UI and performance testing and was not run to test Windows desktop applications, which all the tools evaluated by Gartner are capable of). These tools differ in their supported platforms, scripting languages and testing methods, but are all capable of automating GUI tests; in order to better comprehend the level of support they offer, some of the best of them are briefly examined in the following [158]:

• The tool Unified Functional Testing was first written by Hewlett-Packard Enterprise, formerly known as QuickTest Professional by Mercury Interactive, and is currently part of Micro Focus [152]. It supports the capture and replay method, and it can for instance generate test cases and test scenarios using a model-based testing approach (by relying on the MaTeLo plugin [159]). Additionally, the tool uses the VBScript language for performing script-based GUI testing, and supports continuous testing (described in section 3.1.5), so that the created tests can be triggered as part of the build process, allowing integration with other software. Among its limitations, this tool runs only in Windows environments. Furthermore, it does not support all web browsers (since the Opera browser is not accounted for) [160].
• Rational Functional Tester is a tool developed by the company IBM Rational Software [144]. As the name implies, it focuses on automated GUI regression testing, but it can also automate the capture and replay method of testing (explained in section 3.1.5). It supports both the Java and the Visual Basic .NET language for creating, editing and executing script-based tests. It also goes beyond representing the tests in a scripting language, by providing a visual storyboard format with screenshots of the application. However, as a drawback, support for model-based testing seems currently lacking [144].
• Tosca is an automation testing tool created by the Austrian software company Tricentis. One core feature of this tool is the Tosca Recorder, which supports the capture and replay method and model-based test creation, reducing the GUI test maintenance cost to a minimum. Moreover, this tool allows continuous integration with various software, even in multiple windows. The integration options even include third party components such as Selenium and SoapUI, but it should be considered that, sometimes, the integration may be challenging [151]. Also, it only runs under Windows.

Even though GUI testing tools focus on functional tests, some of them can still provide different kinds of assistance for the verification of the look and feel of the GUI. Visual test automation brings its own set of challenges, as the issue is totally different from automating the GUI testing methods: a software solution would have to recognize the interface the same way a human tester would. Some tools, such as Applitools [161] and Sikuli [162], specialise in solving the visual recognition problem. The first of them, Applitools, relies on artificial intelligence emulating the human eye and brain in order to detect layout issues, even in complex and dynamic pages [161]. Some tools created through academic research, such as QUESTIM (Quality Estimator using Metrics), go beyond checking the layout and raise the bar even higher by validating that the design is aesthetically pleasant [163].

On the usability side, among the first few tools that assist in automating usability testing is Morae [178], which specializes in remote usability testing, focusing on web applications and desktop applications, and also on mobile apps; it's among the most popular tools, but it's significantly more expensive compared to its competition. Other alternatives include UserTesting [109], TryMyUI [174] and WhatUsersDo [176], which specialize in affordable remote unmoderated testing for web applications. The tool LookBack [168], on the other hand, supports both moderated and unmoderated testing on all mobile devices. Based only on the background information from the commercial usability tools presented in table 4.2, the tools focus on the testing and inquiry usability evaluation methods (which were described in details in section 3.3.3 along with the other evaluation classes). Tool support for these methods could be observed to range from automated capture (such as Appsee [164]) to automated planning (as with Naview [172]), along with automated critique and analysis. Furthermore, most of these tools target the capture of usability-relevant data, since all the listed tools were capable of this category. Commercial usability tools have thus sort of marginalised the automation of inspections, analytical modeling and simulation evaluation methods. On the other hand, researchers and usability engineers around the academic community have proven that tools could be built to automate at least some of those methods. For instance, GLEAN is a tool demonstrating the utility of analytical modeling (and in particular of the GOMS model) to pinpoint the problematic areas of an interface [179]. Secondly, VRUSE is a usability simulation of an interface from a user's perspective [180]. Also, the InfoScent Bloodhound Simulator is another usability simulation tool specializing in navigation analysis to discover web usability issues [18]. One final example is the USEFUL framework, which focuses on automated web usability inspection [17]: compared to the inspection made by human evaluators, the automated inspection through the framework resulted in the discovery of the majority of the usability issues, and even found some violations that went unnoticed by the human testers [17].

Even though table 4.2 includes an extensive list of usability tools, the inspections might also benefit from automation tools that focus on one particular aspect of usability. For instance, one usability rule of thumb is that users wait as little as possible during the interaction with the GUI [181]. Therefore, if it's needed to check that the GUI is fast and responsive under different internet connection bandwidth speeds, it's possible to rely on the tool GTmetrix, developed by GT.net [182], to analyse the GUI's load speed from different geographical locations, find potential problems, and get some recommended improvements. Consequently, such tools can support the automated evaluation and inspection. However, referring to such services as usability tools might be some kind of a stretch, because the primary goal of such tools is performance testing (described in section 3.1.5).
4.1.4 Review of Usability Evaluation Tools

Over the years, various usability automation tools have been produced. Table 4.2 offers a list of some usability tools still popular among designers, development teams and product experts (as of December 2018), sorted in ascending order by tool name. The list's focus is on the tool name, the team or software company offering it, the supported platforms, and the best supported methods. The "Supported Methods" column describes the various usability evaluation methods supported by said software; the automation categories associated with each method (planning, capture, analysis and critique) are presented in a parenthetical statement. These categories do not include third party services that can completely perform the usability evaluation themselves (a form of outsourcing, as defined in section 3.3.5), even though such services are still considered and accounted for in the list.

Table 4.2: Comparative list of active tools for usability evaluation

Tool Name | Offered By | Supported Platforms | Supported Methods
Appsee [164] | appsee.com | Mobile | Testing methods (capture and analysis)
Attensee [165] | attensee.com | Web applications | Attention tracking focus: Testing methods (capture, analysis, and critique)
CrowdSignal [166] | Automattic | Web applications | Inquiry methods (planning and capture)
Helio [167] | Zurb | Web applications | Inquiry methods (capture)
Loop11 [169] | Loop11.com | Web applications | Remote testing methods (capture and analysis)
LookBack [168] | lookback.io | Mobile; Web applications | Testing methods (capture)
Morae [170] | TechSmith | Desktop: Windows | Testing methods (planning, capture, and analysis)
MouseStats [171] | mousestats.com | Web applications | Testing methods (capture and analysis)
Naview [172] | Volkside | Web applications | Testing methods (planning and capture)
Silverback [173] | Clearleft Ltd. | Desktop: Mac | Testing methods (capture)
TryMyUI [174] | trymyui.com | Web applications | Remote testing methods (capture and analysis)
Usabilla [175] | Usabilla B.V. | Mobile; Web applications | Inquiry methods (capture)
UserTesting [109] | usertesting.com | Mobile; Web applications | Remote testing methods (planning, capture, and analysis)
WhatUsersDo [176] | WhatUsersDo Ltd. | Web applications | Remote testing methods (capture)
Woopra [177] | Woopra Inc. | Mobile; Web applications | Testing methods (capture and analysis)
Nielsen a devices, any for Jakob be mobile suited by on touchscreen-based can be thumb applied might guidelines of be guidelines rules general Other cannot usability Some [8]. ten they 2.2.4) the user-friendliness. section as of in such described level GUI also high of type a any reach on best applied describing can principles or system guidelines are a heuristics how usability 2.2.4, section in discussed Development previously As Software in Heuristics of Evolution the 4.2 testing. in usability than in field factor human testing the GUI of the importance in in- the to more the due user present mostly through from is field, functionalities derived engineering efforts usability data automation the usability-relevant for checking capturing Support on target generally experiences. focus tools mainly usability tools while terface, testing for GUI not potential do conclusion, of currently lot In a tools leaves inspection which usability capabilities, hand, limited growth. have other future and the languages, On scripting support supports any they [151] [4]). support Tosca the how where Selenium allow as of even (such point tools might integration a testing and script-based the to party interaction, third user progressed additional of an have of forms integration tools different cover testing languages, GUI scripting different.different script-based widely words, is entirely tools other two by the automation In are require their functionalities also for which checking support tests, the and and GUI guidelines processes, script-based validating distinct However, to aspect similar This level. bit interface. access accessing little the require same a in often process presented would components inspection inspection graphical the complete various makes the a of of attributes automation limited, internal the is the guidelines that multiple conclude of to inspection the easy covering it’s tools inspections. usability usability of automated number and the testing actions. 
though GUI Even user script-based processing in and lies observing similarity concepts interesting for Both Another need testing. the GUI without model-based conclusions of draw automation and model- to models analytical similar analyse automating is currently background tool the usability commercial methods, no ing inspection. be usability to method. a seem inquiry in there the guidelines though Even and specific method of methods evaluation testing evaluation the the on support constitute focus tools these generally other and market Some users, the of presence on the currently visual require tools the that usability checking hand, tests other the GUI [161]. On interface automate an to across recent components managed more graphical successfully various Besides, of Applitools for 3.2.4). correctness GUI section as the in such to details access tools in deeper discussed GUI providing cap- were on the (which or on tests approach, focus script-based model-based might writing a process on automation method, tests The replay resulting testing. and the regression ture and of commercial GUI, part Most the as interfaces. executed through user usually functionalities Heuristics in are software interest of evalua- validate their Evolution usability of help and apart generally tools 4.2. common tools testing in GUI GUI much that share observed not be do could tools it tion 4.1.4, and 4.1.3 sections on Based Tools Usability Versus Tools Testing GUI 4.1.5 Inspection Heuristic-based Automating 4. Chapter Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. rnilsdsrb eea rfrne htcnb ple nayue nefc.Wieother users. targeted While specific the a as of such interface. background use and user of age context any particular the a or on on device, focus applied mobile or environment, be certain a can Some with that deal research. 
Usability heuristics have been developed through both academic and commercial research. Some principles describe general preferences that can be applied on any user interface, while other guidelines deal with a certain environment, a mobile device, or a particular context of use such as the age and background of the targeted users.

4.2.1 Ergonomic Roots of Usability Heuristics

Design principles precede usability engineering by far. Among the earliest design guidelines are the three core design principles created by Vitruvius, dating back to the first century BC (Before Christ) [185]. His work later influenced various historical figures (who got inspired into now famous drawings such as the Vitruvian Man by Leonardo da Vinci [186]). These three design principles are briefly explained below [187]:

• Firmitas: The design's strength and durability, including its reliability, stability, and robustness.
• Utilitas: The design's convenience and suitability, in the sense of the design's effectiveness and efficiency for the needs of its intended users.
• Venustas: The perceived beauty of the design.

Design guidelines gained more importance with the development of new complex machines and weaponry, starting in world wars one and two. The need for personnel capable of using the more complex machinery, as well as the cost and time associated with training and replacing said personnel, led to the development of a discipline called Human Factors and Ergonomics (HF&E) [188]. This field is concerned with comprehending the interaction between humans and other system elements, and with designing systems using principles, data, and methods to optimize human well-being and proper system performance [189]. Usability engineering is a descendent of HF&E, inheriting many of its evaluation techniques such as the cognitive walkthrough, the thinking aloud protocol, and the use of questionnaires (described in section 3.3.3) [190]. With the dawn of information technologies and the emergence of GUI design, usability gradually became a key goal, so that personal computers could be used by average users rather than only trained technical specialists. In 1990, the authors Jakob Nielsen and Rolf Molich published their seminal paper titled "Heuristic Evaluation of User Interfaces", in which heuristic evaluation is presented as a discount usability evaluation method [191]. Consequently, more research was funded to derive more refined usability guidelines better suiting the context of use. These studies have produced numerous influential usability guidelines over the years, some of which are listed below:

• The 944 official guidelines for designing user interface software by Smith and Mosier [26].
• The user experience guidelines for Windows-based desktop applications by Microsoft [91].
• The iOS human interface guidelines by the Apple company [92].
• The Android user interface guidelines by Google LLC [93].
• The user interface guidelines for Windows Mobile by Microsoft [94].
• The 113 guidelines for homepage usability by Jakob Nielsen [192].
• The 247 web usability guidelines by David Travis [193].
4.2.2 Traditional Versus Modern Heuristics

Among the many guidelines produced over time, some differences between the older, more traditional heuristics and the newer, more recent ones began to surface as technology progressed, and the more context-specific the traditional heuristics become, the more their relevance is put into question. For instance, the principle of displaying an explicit warning message to warn users of potential data loss, and prompting them for an appropriate action, was published by Smith and Mosier back in 1986 [26]. Despite this rule originally being intended for WIMP software and not currently being relevant for any type of GUI (game applications, for instance), it is general enough to be expected across any platform.

Nowadays, heuristics can still be deduced mainly from previous work, even if the goal of the researcher is covering additional areas of GUI, such as the GUI of mobile devices, tablets, or augmented and virtual reality headsets [194]. In other cases, the change does not only happen on the machine side, but also on the human side: the context of interaction, the age of users, or their technical expertise can also differ, which sometimes means that the way of interaction between the user and the design changes as well [195]. By deduction, the continued emergence of new information technologies along with new interaction possibilities keeps bringing forth different usability challenges. Therefore, the relevance of some traditional usability heuristics might be put into question. In order to validate older usability guidelines, or derive new ones, it is possible to follow a methodology consisting of the six following steps [196]:

1. Gather literature related to the topic of research, and explore applications specific to that topic.
2. Describe the most important common characteristics and concepts from the previously gathered data.
3. Relate the identified characteristics to the best possible usability heuristics, based on case studies analysis and previous guidelines.
4. Formally specify and explain the proposed usability heuristics in the form of a standard template.
5. Validate the new guidelines through additional experiments performed on relevant case studies.
6. Refine and improve the list of heuristics based on the feedback gained from previous validation efforts.

The aforementioned methodology is effective in identifying which traditional heuristics did not age well, but it is not the only possible method. A simpler and more informal approach would be to examine some illustrative examples and conclude whether applying a certain traditional guideline still results in a better design, whether a more modern guideline is available, or whether improvement is possible. For instance, according to David Travis, it should be possible to allow the user to configure the number of search results per page returned by the search function of a website [193]. Yet, according to a more modern usability guideline, this approach is outdated: Jakob Nielsen, on the other hand, states that offering a single default number of search results per page usually appears to be better than letting the user choose how many they want to see [197]. However, he still believes that pagination in general is well suited for long listings, including photo galleries [197].
The figure 4.1 shows a partial screenshot retrieved from the "Ebay" website. The image contains a dropdown menu with pagination options, with a red arrow bringing attention to the bottom area displaying the menu. Based on figure 4.1, and taking Travis's guideline into account, it can be noted that the Ebay website is following Travis's advice and not Nielsen's, as the pagination options contain a fixed default value representing the number of displayed search results. Due to the constant improvement of software and technology, the need for a pagination technique evolved beyond restrictions on the number of results: the more recent infinite scrolling technique would load more search results while the user continuously scrolls down the page, allowing the page to grow while retrieving and displaying more content [198]. All the results are displayed in one growing section, and no pagination options are any longer displayed anywhere on the screen. Comparing the traditional pagination from figure 4.1 with infinite scrolling, the first method would require at least two user actions (a click on a page number and then a scroll down) before showing more results, while the second method requires only one user action, since the user only scrolls down (and the results are then loaded dynamically). Infinite scrolling therefore requires less steps for users to perform, and it is cognitively less complex. Furthermore, studies by Van Deursen and Van Dijk [199] indicate that 91% of users performing a search do not go beyond anything offered on the first page of search results, and by extension, that scrolling down is easier for them than pagination clicking. This means that the infinite scrolling technique might currently be user-friendlier than the traditional pagination technique, and that yet another, more definitive web usability heuristic on how to draw search results could be derived. However, this conclusion also means that usability studies should continuously examine, compare, and refine all usability heuristics, advising designers on both options.

Another example is included in the web usability guidelines for smartphones, where it was advised to display navigational components only on the homepage and to avoid repeating them on all web pages, as a particular site should require minimal scrolling [200]. This modern guideline (dating from 2011) directly contradicts with a traditional one stated by Jakob Nielsen in his book "Designing web usability: the practice of simplicity" (dating from 1999), where he encouraged web designers to consistently display the navigational component on all the web pages, preferably indicating the user's location and highlighting it in the site map [77]. Even though the traditional guideline is still applicable in the context of desktop applications, the modern one mentioned the context of smartphones, which is a different context. Based on the previous examples, it can be deduced that the relevance of traditional usability heuristics is subject to change, brought by more modern design situations, and that therefore even modern usability guidelines need to be regularly updated. In some situations, two design options might contradict each other: one heuristic claims that a menu should not have more than five menu options, while another heuristic might state that similar menu options should be grouped together, even when there are more than five similar elements [201]. Therefore, when selecting which heuristics to include in a usability inspection, it is important for usability experts to take into account the context of use, and to choose well which guidelines to prioritise before GUI designers use them in their work.

4.3 Value and Potential of GUI Event Sequencing in Automating Heuristic-based Usability Inspection

Since heuristic evaluation does not require the presence of a user sample, it is among the cheapest usability inspections. Consequently, a successful automated approach would be even cheaper than the traditional one, simplifying the iterative design process. Moreover, based on the review of usability tools discussed in section 4.1.4, it was observed that most automation efforts focus on capturing usability-relevant data collected from actual users. On the other hand, a few academic studies [13] [15] [16] [17] [18] have theorized that automating the detection of guideline violations with a tool would close the gap with manual heuristic evaluation, despite currently not having notable commercial support. To achieve that, the use of GUI event sequencing is considered. A sequence of GUI events represents an ordered chain of related events that can follow each other, with method calls and data inputs [64]. Script-based GUI testing requires support for sequencing GUI events; their goal is to emulate a usage scenario during an automated GUI test, in order to check if the different graphical components of the GUI are behaving properly while a task is performed. In this thesis, it is hypothesized that GUI event sequencing, despite being a GUI testing technique, can be considered for automating heuristic-based usability inspection, and that such a technique can improve the automation of the inspection of some heuristics that involve an ordered series of events, such as the filling of a form. However, before following with the automation of a usability inspection, it is important to first select a stable set of heuristics and specify a particular context of use, since guidelines might evolve over time and with the change of context, as might the potential extent of event sequencing.

Since desktop applications have a longer history than their web and mobile counterparts, WIMP design guidelines for desktop applications started first, and research on them continues. Furthermore, Microsoft Windows is the most widely used operating system on desktop devices to this day (holding 90.77% of the desktop market share as of October 2017 [202]). Therefore, the heuristic evaluation considered in the context of testing the hypothesis, namely that an event-driven approach can be applied to it, is restricted to Windows desktop applications, and the set of heuristics to automate would be derived from the 944 clear design guidelines for user interface software by Smith and Mosier [26]. Under these circumstances, the study would indicate how many of the selected guidelines can be automated with event sequencing, how many guidelines can only be automated through other means, and finally, how many of them should be checked manually.
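The notion of an ordered chain of GUI events with data inputs can be sketched in code. The snippet below is a minimal illustration only, not tied to any concrete GUI automation library: events are plain records, and a replay function drives a stand-in, in-memory form object while emulating a usage scenario. All names (`GuiEvent`, `MockForm`, `replay`, the sign-up scenario) are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GuiEvent:
    """One step of an ordered GUI event sequence (hypothetical structure)."""
    action: str                 # e.g. "type" or "click"
    target: str                 # identifier of the GUI component
    data: Optional[str] = None  # payload for data-entry events

class MockForm:
    """Stand-in for a real GUI: two text fields and a submit button."""
    def __init__(self):
        self.fields = {"username": "", "password": ""}
        self.submitted = False

    def receive(self, event: GuiEvent):
        if event.action == "type":
            self.fields[event.target] += event.data
        elif event.action == "click" and event.target == "submit":
            self.submitted = True

def replay(form: MockForm, sequence: List[GuiEvent]):
    """Emulate a usage scenario by firing the events in their given order."""
    for event in sequence:
        form.receive(event)

# A hypothetical sign-up scenario expressed as an ordered event sequence.
scenario = [
    GuiEvent("type", "username", "amir"),
    GuiEvent("type", "password", "secret"),
    GuiEvent("click", "submit"),
]
form = MockForm()
replay(form, scenario)
print(form.fields["username"], form.submitted)  # → amir True
```

A real implementation would replace `MockForm` with a driver for the application under test; the ordered-list shape of the scenario is the part that carries over.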
Figure 4.1: Screenshot of a pagination area under the search results retrieved from the Ebay website

5 Proof of Concept for Automating Heuristic-based Usability Inspection with GUI Event Sequencing

To demonstrate that GUI event sequencing, an automation technique usually reserved for the specific field of script-based GUI testing, can benefit usability inspections, it is needed to first derive a minimalistic list of relevant heuristics from a sizeable and widely accepted set of guidelines. Then, the practical automation of an arranged sample of those selected heuristics is verified in a tool. The tool particularly has the purpose of testing the validity of the claim that GUI event sequencing improves the automation efforts of a heuristic evaluation.

5.1 Summary of the Problem Addressed

Heuristic evaluation, being a cheap usability inspection method, is generally performed manually by a usability expert (and is described in more detail in section 3.3.3). The lists of inspected guidelines vary in their amount, their focus, their context of use, and even their durability, depending on the state of the art of the technology at the time the guidelines were derived (as explained in section 4.2). In order to facilitate the automation of heuristics, it is first required to derive a reasonable set of them, analyse the feasibility of their automation, and classify them according to their compatibility with the GUI event sequencing technique and tooling. Consequently, to check if a couple of the previously gathered guidelines can be automated in a practical experiment, a sample of software applications that can run under Windows is selected.

5.2 Deriving and Structuring adequate Usability Heuristics

In an effort to grasp the usability of Windows desktop applications, a minimalistic set of heuristics is to be derived from a precise source of existing traditional design principles. One of the best possible candidates for such a set is the list of guidelines for user interfaces by Smith and Mosier, sponsored by the Electronic System Division (ESD) of the United States Air Force [26]. The list includes 944 design guidelines, and their objective was to promote the efficiency and learnability of the design, as well as minimizing the memory load of the user and supporting users with different skill levels. Moreover, even though the report dates back to 1986, the durability of these guidelines is praiseworthy, which makes their list one of the largest sources of traditional design heuristics [203]. In a 2005 study by Jakob Nielsen, he selected ten guidelines from each one of six different sections of the document (by arbitrarily choosing the sections and randomly picking the guidelines belonging to each section), and he concluded that 54 out of the 60 reviewed guidelines were still valid [204]. However, the results might have greatly changed had he randomly picked another page in the document (or, in other words, other guidelines in the chosen list).
a functional in same include offered the to of be sections intended would guidelines was identical section almost every quently, because another, to section Redundancy: follow- the on based is consideration from remove criteria: to ing heuristics which filtering behind process Applications The Windows for Heuristics Usability Filtering 5.2.1 turned which those as well as designers, select GUI to for necessary intuitive it’s are obsolete. became Therefore, which or guidelines [26]. context-specific, exclude and use too specific applications, of either any desktop about context Windows for found for their designers not, guidelines in that relevant would discovered applicable currently they In some guidelines surveys, while the own [26]. their relevant of software of 40% be one particular not would through any Mosier, fact, guidelines designing In and the when application. Smith of applied to some be according can words, However, guidelines other single proposed number. each their reference and of computers, unique all mainframe own of its interfaces user has for guideline developed originally was report The uoaino ersi-ae sblt npcin7 128 / 72 Inspection Usability Heuristic-based of Automation Heuristics adequate Structuring and Deriving [26]: covers 5.2. list following the This in the heuristics. described briefly as usability interaction selected of user-system are set of relevant Mosier areas and and functional different minimalistic Smith six a by deriving interfaces for user point for starting guidelines design 944 the Therefore, Concept of Proof 5. Chapter .Dt rtcin h aapoeto eto nld 0gieie.Ti atfcsson focuses part This guidelines. 70 include section protection data The Protection: Data 6. cover They principles. 83 holds guidelines transmission data of list The Transmission: Data to 5. how describes section This guidelines. 110 has part guidance user The Guidance: User 4. con- Sequence guidelines. 184 include control sequence for section The Control: Sequence 3. 
data output to how explains It guidelines. 298 holds section display data user The assist Display: to Data how 2. describing all guidelines, 199 has section entry data The Entry: Data 1. h euiyo h nomto yavsn nhwt rtc tfo etutv srac- user destructive reference from the guidelines) precedes it its "6." protect of prefix each to the for (Also, how number access. on unauthorised advising and failures, by system information tions, the of security "5.") with the start that codes with referenced are guidelines data (These receiving users. and other sending from interfaces user and present to to how discuss and communication, email the in instructions formal even "4.") and with start labels, references prompts, their (Additionally, messages, material. error help as such data present (The transactions. user-system "3."). with terminate begin or both always covers guidelines interrupt, part these This start, of system). codes that the reference next from actions response the system a to by and followed transaction mean- user is interaction, user-system which user-system action of one user unit from any functional smallest ing transition the to the referring behind transaction a logic (with the to refers trol output. "2."). such with from start exclude guidelines or these of include, numbers to reference information the additional (All what including user, the such to to "1."). respond with should begin system document the the in how codes and reference system, their the all (Also, in input. data of input involving actions h rgnlgieie aesm eesr vra ntercvrg rmone from coverage their in overlap necessary some have guidelines original The Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. 
Inapplicability on a desktop GUI: A large group of the offered heuristics do not apply to a regular Windows desktop application. It is therefore necessary to exclude guidelines such as those describing appropriate commands in a command line application, or those ensuring a better user-system interaction in a voice-based application.

Turning obsolete: Heuristics that became obsolete over time are naturally dropped from consideration. For instance, the guideline with reference 1.0/9 "Explicit ENTER Action" states that the user is always required to take an explicit ENTER action to initiate processing, and that processing should not start as a side effect of some other action. This is no longer correct: a system can, for example, start processing some of the entered data before the user submits a registration request, by checking if the chosen username is available while the user is typing the text. The same applies for the auto-complete feature, providing search suggestions while the user is typing his search query; an explicit ENTER action would not be needed. Therefore, the automation of any guideline that is no longer valid would not be needed either.

Having a highly specific context: The list has a high number of guidelines designed for various specific contexts of use. For instance, the report offers 18 guidelines for flowcharts alone, 19 guidelines on how to use graphics for drawing, and 12 guidelines for the display of maps. The same logic applies to the complex security-oriented rules, such as guidelines on the validation or encryption of messages: in order to minimize the set of considered rules, they are omitted from the classification, whether they are included in the Data Protection section of the report or not. Such guidelines were probably designed for military use (due to the importance of security and secrecy in military data), and do not cover the main focus of the simple usability of Windows applications.

Being presently a common practice: Information technology progressed so far that many heuristics from the report are now endorsed by default, because the Windows operating systems currently used support users much better than mainframe computers did. For example, the guideline referenced 3.1.3/6 "Dual Activation for Pointing" describes what is now commonly known as the predecessor of the double-click, as used in the 1980s. It is not needed anymore, as it is automatically covered by the modern operating systems. Besides, some other guidelines became intuitive for GUI designers over time. An example of such guidelines is the guideline coded 1.4/7 "Protected Labels", which states that labels cannot be edited by users. It would not be necessary to advise designers to protect labels anymore, and therefore it is not needed to include such guidelines in future classifications.

Retaining the most relevant guidelines while excluding the rest reduced the number of heuristics from 944 to 85. The accepted heuristics are classified by the rules described in sections 5.2.2, 5.2.3, and 5.2.4, and this result is evaluated in detail, with an explanation of the reasoning behind each element and its categorisation, in section 5.4.1.

5.2.2 Listing Heuristics Compatible with Automation by Event Sequencing

The following list consists of heuristics, relevant in the general context of Windows desktop applications, which can be verified automatically with the help of GUI event sequencing, as applied in script-based GUI testing.
the the on entered on elements tasks was GUI these input the perform described trigger same to the would previously the user with the the if performing compliance for checking for needed verifying while, steps needed test the events a all GUI automate and of examples to sequence scenarios, two the these feasible Were calling been anyway). by have signing-up heuristic process would for sign-up reason it the his confirming order hypothetical, that’s in In (since step not login. last user a the the for during sign-in possible again immediately also be data can Then, same to it the application. guideline, enter the an to violate asked in to not is would signing-up expected to example, user be of natural the not part more registration, should A the as user confirming succeed. password the after to and items, task username the filter a of to part part entering is first second second be the the the if the for and instance, query simulate items, For search to for the respected. used search re-type was be to guideline they could is the if events task if again GUI a verify task of of and previous sequence fashion, the A ordered from an data task. in initiating same second tasks and the the task, re-enter the perform first to verify to his need required finishing not to are after shall possible Then, user Said the is needed. behaviour. one, data it user second enter mimics a and guideline, that task, events the one of perform with sequence can a compatible user creating is by recommendation question afterwards the in of data validity entered task that the access that [26]. can tasks Provided system other the for that or and task once, same the only for data particular any GUI a Once: Only Entered Data test 1.0/1 a from events GUI of [26]: series following the ordered the are an certain scenario at guidelines usage a triggering arrive These hypothetical re-enact by to script. Such to automatically order need guideline. 
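The timer-based check for guideline 1.0/4 can be sketched as follows. This is a minimal illustration, not the thesis tool itself: `trigger` and `wait_for_response` are hypothetical stand-ins for the simulated GUI interaction and the detection of its feedback, and the GUI response is emulated here with a short sleep.

```python
import time

# Delay limits discussed above: 0.2 s for normal feedback, up to 1 s or 10 s
# depending on the nature of the operation.
THRESHOLDS = {"normal": 0.2, "flow_of_thought": 1.0, "attention": 10.0}

def measure_response(trigger, wait_for_response):
    """Time the gap between triggering a GUI event and detecting its response."""
    start = time.monotonic()
    trigger()            # e.g. a simulated click on the element under test
    wait_for_response()  # blocks until the response appears on the GUI
    return time.monotonic() - start

def check_fast_response(trigger, wait_for_response, operation_kind="normal"):
    """Return (passed, elapsed) for the response-time check."""
    elapsed = measure_response(trigger, wait_for_response)
    return elapsed <= THRESHOLDS[operation_kind], elapsed

# Stand-in GUI: a "click" whose feedback takes about 50 ms to render.
ok, elapsed = check_fast_response(lambda: None, lambda: time.sleep(0.05))
print(ok)  # True: 0.05 s is under the 0.2 s limit
```

A real test would replace the lambdas with calls into a GUI automation driver; the timing logic stays the same.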
1.0/12 Feedback for Completion of Data Entry: If a data entry transaction was successful, the system should acknowledge the completion of data entry with an adequate message. If data entry was unsuccessful, an error message should inform the user [26]. This recommendation could be verified with sequencing, for instance by detecting a text notification displayed on the GUI that affirms to the user the result of some action (a confirmation message for a successful action, or an error message for a failed one). The automation of such tests would have to start by checking the state of the ongoing operation, or by having the user start a sequence of GUI events leading to either a successful operation or an erroneous one (depending on the type of feedback tested).

1.0/16 Partitioning Long Data Items: If long data must be entered, that data should be partitioned into shorter groups to simplify its entry and its display [26]. This principle can still apply, for instance when designing input fields for entering software license keys, or an International Bank Account Number (IBAN). One way of verifying this guideline using event sequencing is to test whether a long data input is partitioned over multiple text fields. The test script would first fill one field with the long data and check if only that field accepted the input; in other words, the test would check whether the entered data is formatted correctly by the GUI (which divides the data into smaller groups). Thus, no matter the way the guideline is tested, the GUI test script would need to emulate the user-system interaction for the data input: event sequencing is needed.

1.0/28 Decimal Point Optional: It should be allowed to either include or dismiss a decimal point at the end of an integer value [26]. With event sequencing, an automated test could check whether decimal points are accounted for by the system, by simulating the input of various numeric forms, sending the data to the system, and checking how the system deals with such data, verifying that it treats them as equivalent alternatives.

1.3/10 Upper and Lower Case Equivalent in Search: Unless stated otherwise by a user, upper and lower case letters should be treated as equivalent when searching data items [26]. This guideline can be checked automatically through different iterations of one search test. In each iteration, the letters of the search input are entered in upper and lower case form multiple times. Finally, the test checks the similarity of the search results between iterations.

1.3/11 Specifying Case in Search: If differentiating upper and lower case letters is important for users, it should be allowed to specify the case when a search operation is performed [26]. If this option is provided as a selectable option on the GUI, then evaluating whether it works in a certain context of use can be automated in a GUI search test.

1.3/24 Storing Frequently Used Text: Users should be allowed to store frequently used text segments, and then in a later use, identify and recall the stored data segments [26]. This guideline can also be verified by a test script. For example, a test can check if the system remembers a previously entered login name, for instance by displaying it as default text in a saved login action. Furthermore, the test can check whether the stored username as well as the password can be used on a second login. In this scenario, testing whether the login works fine can be done through event sequencing, since the login task consists of the input of data (username and password, and optionally submitting it).
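The iteration-and-compare logic for guideline 1.3/10 can be illustrated with a small Python sketch. The `search` function is a hypothetical stand-in for the search feature under test; a real test would drive the GUI's search field instead and harvest the displayed result list.

```python
def search(items, query):
    """Stand-in for the search feature under test (case-insensitive here)."""
    q = query.casefold()
    return [item for item in items if q in item.casefold()]

def check_case_equivalence(items, query):
    """Re-run the same search with different casings and compare the results."""
    variants = [query.lower(), query.upper(), query.title()]
    results = [search(items, v) for v in variants]
    # The guideline holds if every casing produced the same result set.
    return all(r == results[0] for r in results)

items = ["Blue chair", "blue table", "Red chair"]
print(check_case_equivalence(items, "blue"))  # True
```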
1.4/2 Flexible Interrupt: When various items are entered as one single transaction, such as in form filling, the user should be allowed to review, cancel, save, or change his data entry action before finalising it [26]. Other than the filling of any particular item (which would not require event sequencing to be tested), an instance of a GUI abiding by this recommendation would be a well-designed interface window of a software installation task. The GUI would have "Next" and "Back" buttons for advancing with the task or reviewing previous choices. It could also have a "Cancel" button to invalidate the actions taken. No matter the corresponding process, evaluating compliance with this heuristic by starting a test script with a series of user actions needs to simulate a sequence of GUI events.

1.4/15 Explicit Tabbing to Data Fields: Users should take explicit tabbing action in order to move from one data entry field to the next. The computer should not provide such tabbing navigation automatically without the user requesting it [26]. The process can be tested in an automated manner, for instance by filling a form (with multiple fields) and checking the location of the cursor while triggering the tab key. The focus of this principle is not only related to the tab key, but also to the expected sequence of user actions, checking the presence or absence of system navigation, tabbing and key actions. Naturally, since the test script simulates the scenario, it would also need to rely on a GUI sequence.

1.6/5 Zooming for Precise Positioning: If data entry depends on the exact placement of some graphic elements on the GUI, users should be allowed to expand the display area [26]. Enlarging the critical part of the GUI would simplify the task. This guideline is oriented toward the way the GUI elements are displayed, and answers questions such as whether the window can be resized, or certain graphics resized or expanded by a user action. For such a test, the properties of the graphical element can be checked before and after the decisive user action: the element has to be bigger afterwards. Therefore, a GUI sequence has to be simulated for this testing to be required in the automated evaluation.

1.7/3 Non-Disruptive Error Messages: When data validation identifies a possible error, an error message should be displayed to the user at the completion of data entry. The ongoing transaction should not be interrupted by error messages [26]. A test could wait for the appearance of an error message after an erroneous data entry. If the error message is displayed before the completion of the task, then the guideline is violated. Otherwise, if the user is only notified of the error after the erroneous data is entered (without the entry being interrupted), the test would pass. Either way, since the test fails or passes only once the user finishes the data entry, a GUI event sequence is needed for automating the test.
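The installation-wizard scenario for guideline 1.4/2 can be modelled with a small state machine. The `InstallerWizard` class below is a hypothetical stand-in for the GUI under test; the scripted calls mirror the "Next", "Back" and "Cancel" event sequence the test would trigger.

```python
class InstallerWizard:
    """Minimal wizard model: Next/Back advance or review steps, Cancel discards."""
    def __init__(self, steps):
        self.steps = steps
        self.index = 0
        self.choices = {}
        self.cancelled = False

    def next(self, choice):
        """Record the choice for the current step and advance."""
        self.choices[self.steps[self.index]] = choice
        self.index = min(self.index + 1, len(self.steps) - 1)

    def back(self):
        """Step back to review a previous choice (it must still be there)."""
        self.index = max(self.index - 1, 0)

    def cancel(self):
        """Invalidate all actions taken so far."""
        self.choices.clear()
        self.cancelled = True

# Scripted event sequence: advance twice, step back to review, then cancel.
w = InstallerWizard(["license", "folder", "confirm"])
w.next("accepted")
w.next("C:/app")
w.back()
assert w.steps[w.index] == "folder" and w.choices["folder"] == "C:/app"
w.cancel()
print(w.cancelled, w.choices)  # True {}
```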
1.7/7 Optional Item-by-Item Validation: In the case of novice users, an optional item by item data validation could be provided [26]. This option might improve the learnability associated with each data item, and novice users' tasks could be divided into multiple entry transactions. On the other hand, making this capability optional would help beginners without getting in the way of experienced users: such an item-by-item validation would not slow down users who are confident about their requirements. Therefore, this guideline is highly compatible with automated testing; the incremental validation of data can be tested through a series of single validation checks, every step accompanied by corresponding GUI event sequencing.

1.8/9 User Review of Prior Entries: When data entered in one transaction are used in another one, the system should retrieve and display any relevant information so that users can review those data, instead of requiring the user to remember it [26]. In such a scenario, the sequence of events that would help automate the guideline would start by the input of data, followed by submitting it to the system, then finally checking if the entered data is displayed in the GUI for review during the next task.

1.8/10 Automatic Entry of Redundant Data: When accessible data is logically related to previous data entries, the user should not be required to enter the relevant data separately. The system should be able to retrieve each unique data attribute without needing the user to enter it a second time. As a negative example, requiring both an item's name and its identification number means entering redundant data in the system; entering only one of them should be enough [26]. Testing the repeatability of data entered in one transaction and used in another one requires activating multiple GUI events, as well as a prior understanding of how the data items logically relate to each other. Therefore, automated testing of this guideline requires both multiple GUI events and event sequencing.

2.3/5 Items Paired for Direct Comparison: When a pair of data items have to be compared, the GUI should display them directly next one to the other [26]. Users will not always be accurate during comparisons; therefore, providing an automated analysis would assist them. Testing whether two particular items can be compared can be performed automatically. The event sequence would start with the user's selection of the two items, then initiate the comparison, and check whether both items under comparison are marked in a particular manner to attract user attention and whether their differences are highlighted in some manner.

2.5/4 Paging Crowded Displays: If a GUI displays a large amount of data items to the point that they cannot be displayed separately in a single frame, then those data items should be partitioned into pages. Furthermore, a convenient navigation procedure should be provided to enable users to easily move from one page to another [26]. Pagination by definition requires dividing a large data set into multiple pages, presented to the user one page at a time. No matter the type of pagination used (traditional list of pages, or dynamic pagination calling for new data to be displayed, as described in section 4.2.2), the GUI event responsible for displaying the next content has to be activated periodically (scrolled down multiple times, if necessary) to ensure its correct behaviour and display. Therefore, automating this test is to be done with event sequencing.
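The paging check for guideline 2.5/4 reduces to repeatedly firing the "next page" event and verifying that every item is shown exactly once. A minimal sketch, with `paginate` as a hypothetical stand-in for the GUI's page mechanism:

```python
def paginate(items, page_size):
    """Stand-in for the GUI's pagination: split the data set into pages."""
    return [items[i:i + page_size] for i in range(0, len(items), page_size)]

def check_pagination(items, page_size):
    """Simulate repeated 'next page' events; every item must appear exactly once."""
    seen = []
    for page in paginate(items, page_size):  # each iteration = one 'next page' event
        assert len(page) <= page_size        # no page may exceed the frame capacity
        seen.extend(page)
    return seen == items

print(check_pagination(list(range(23)), 10))  # True: pages of 10, 10 and 3 items
```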
2.6/2 Removing Highlighting: When highlighting is employed to emphasize the importance of certain display elements, such highlighting should be dismissed the moment it is no longer required [26]. For instance, if an error is highlighted on the GUI, the highlighting should be removed once the issue is corrected. To check this guideline, an automated test can first cause the error and check if the graphical element attracting the attention of the user is highlighted, then perform the actions to fix the error. If the error is no longer there and the graphical element is still highlighted, the test fails; otherwise, the test passes.

2.7.3/9 Prediction Display: In order to assist users in understanding and effectively responding to complex data changes, it is advised to consider, when appropriate, displaying predicted upcoming data states based on an automated analysis of certain models of data use. Depending on the context, examples would be displaying the predicted flight path of an airplane, or displaying expected market fluctuation dynamics [26]. Checking whether the prediction function works correctly should not be done manually. Verifying the correct implementation of this recommendation is done by comparing the displayed prediction values with expected values for particular measured data. As a first step, the GUI event responsible for sending the relevant data is called; the test then listens for the system response, catches the displayed prediction results, and compares them with the expected data defined in the test setting.

2.7.5/3 User-Specified Windows: When it cannot be known in advance that many different types of data would need to be viewed jointly, the user should be allowed to specify and choose the data windows or perspectives that have to be shared on the display at the same time [26]. An automated test can check if different types of data can be displayed simultaneously. In the simplest test, at least two separate window elements displayed at the same time would have to be taken into account, which means that a sequence of at least two GUI events is required.

3.0/1 Flexible Sequence Control: The sequence of needed user actions should be flexible. A user should have some freedom when performing transactions related to data entry and display, and he can ask for guidance associated with any different transaction or transaction sequences he's engaged in [26]. Testing this automatically using GUI event sequencing requires each transaction to have one or more corresponding GUI events associated with it.

3.0/20 Indicating Control Lockout: When the data entry must be delayed in order to give the system time to process prior entries, then the GUI should indicate the lockout to the user [26]. Consider, for instance, a submit button action whose processing in response demands more than one second to be handled. A possible sequence for this test can check that the submit button action is not repeated while the first user action is processed (or that the button is disabled until the result is displayed). A second example would be a progress bar that is briefly displayed while the system is processing a complex operation: the test would check if the progress bar indicates the ongoing operation and if the display is disabled in the meantime, until the completion of the task. Either way, a sequence of GUI events would be needed for automating these tests.
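The highlight-lifecycle test for guideline 2.6/2 can be sketched as follows. `FormField` is a hypothetical stand-in for the GUI element under test: the script first causes the error, checks the highlight, then fixes the error and checks that the highlight is gone.

```python
class FormField:
    """Stand-in field: highlighting must disappear once the error is corrected."""
    def __init__(self):
        self.value = ""
        self.highlighted = False

    def validate(self):
        # Error condition in this sketch: the field must contain digits only.
        self.highlighted = not self.value.isdigit()

field = FormField()
field.value = "12a"
field.validate()
assert field.highlighted          # step 1: cause the error, highlight expected
field.value = "123"
field.validate()
print(field.highlighted)          # False: highlight removed after the fix
```

If the final check still found `highlighted == True`, the test for guideline 2.6/2 would fail.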
3.1.3/7 Menu Selection by Keyed Entry: It is recommended to allow users to accomplish menu selection by keyed entry. For example, simultaneously pressing two keys would select a menu option faster, and better assist experienced users [26]. Checking this recommendation can be automated by simulating the pressing of the shortcut keys associated with a menu option, and checking whether the corresponding result appeared on the GUI. Even though simulating key entries is not dependent on graphical elements, the resulting feedback is. Therefore, a test would wait for the corresponding feedback (or catch the properties of those graphical elements associated with the result of the key presses) to check if they indeed conform with what is expected; access to GUI elements and event sequencing techniques is then needed.

3.2/10 Only Available Options Offered: Users should only be provided options that are actually applicable and relevant for their current task [26]. In this case, an automated test can verify whether, after various context changes, options that are not applicable are disabled (or even absent), and whether the availability of such options is restored when the context allows them again. These scenarios are all examined by performing a sequence of user actions in accordance with the requirements. Therefore, GUI event sequencing can help automate the tests checking compliance with this guideline.

3.3/1 User Interruption of Transaction: It should be allowed for a user to interrupt a current transaction, in the manner appropriate for that task's requirements [26]. Since the task in question involves having multiple steps within one transaction, and a user action interrupting that transaction, this scenario can be tested by relying on event sequencing.

3.3/3 CANCEL Option: When appropriate to sequence control, the option to cancel a task should be provided to users. This will have the effect of invalidating any changes that were made and restoring the current display to its previous state [26]. As the task under test involves a sequence of GUI operations (starting, cancelling), it can of course be tested with GUI event sequencing.

3.3/4 BACKUP Option: When applicable and appropriate, the GUI should provide the option to persist in the system all the data states related to the transaction performed by the user, without terminating the work associated with the backed-up data or closing the application [26]. Just like the two previous guidelines, the backup action allows returning to the display, and this recommendation can therefore be tested automatically by performing a series of operations, relying on GUI event sequencing.

3.3/6 RESTART Option: If possible, the option to restart a transaction should be provided. This will first result in cancelling any entries made in the current transaction sequence and returning to the beginning of the sequence. Afterwards, the user should confirm the restart action in the GUI if the restarting would cause data loss [26]. This task can then be tested with event sequencing, similarly to the previously described guidelines, due to it consisting of multiple user actions.
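The shortcut-key check for guideline 3.1.3/7 can be illustrated with a toy model. The `MENU` table and `Gui` class are hypothetical stand-ins: the test simulates the key chord and then inspects the GUI-visible result instead of the key event itself.

```python
# Hypothetical menu model: key chords map to menu actions.
MENU = {("ctrl", "s"): "save", ("ctrl", "o"): "open"}

class Gui:
    """Stand-in GUI: records the last action so the test can observe feedback."""
    def __init__(self):
        self.last_action = None

    def press(self, *keys):
        action = MENU.get(tuple(keys))
        if action is not None:
            self.last_action = action  # the feedback the test waits for

gui = Gui()
gui.press("ctrl", "s")      # simulate pressing the two keys simultaneously
print(gui.last_action)      # save
```

The assertion in a real test would target the graphical element that reflects the executed menu action, not an internal attribute.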
3.3/7 END Option: When permitted, the GUI should provide the option to end a transaction, most notably when said transaction ending action is repeatedly performed as the concluding part of a current transaction loop [26]. It is possible to test the effect of ending an ongoing GUI task, checking the GUI before and after the task is ended. Therefore, the test would need to check a sequence of GUI events through event sequencing.

3.3/8 PAUSE and CONTINUE Options: If sequence control allows it, the GUI should provide the options for pausing and then later resuming a transaction. Pausing first and continuing later would have to leave the transaction intact: it should not have the effect of modifying the data entries or interfering with the control logic of the sequence. If it's not appropriate to pause a certain GUI task, the actions associated with interrupting the transaction should not interfere [26]. The automated test checking if pausing a task can be done would have to rely on GUI event sequencing.

3.3/10 SUSPEND Option: If appropriate, the GUI should provide users with the means to suspend their work on a task. The result of suspending a task would be the preservation of the current transaction status when the user exits the system. Then, when the user returns to the system at a later time, he would be permitted to resume the suspended task [26]. An automated test can check the suspend action, then restart the application, preserving the relevant transaction states, and check if the task can be resumed where it was before being suspended.

3.5/10 UNDO to Reverse Control Actions: The immediate reversal of user actions should be possible by invoking the undo command [26]. A sequence of events can simulate the reverse control action after the completion of a preceding task (provided the task is reversible according to the requirements), and then check if the behaviour of the GUI is as expected.

3.5/11 Preventing Data Loss at LOG-OFF: Before a user logs off from the application, the system should check for pending transactions, and if any task is found that was not yet completed, or a pending transaction whose unsaved data would be lost, the GUI should inform the user with an advisory message [26]. With event sequencing, the logout user action is simulated and, if a possible data loss is in question, the test can check if a confirmation message appears, notifying the user and asking for confirmation before proceeding with the requested logout.

3.5/12 Immediate Data Correction: When a data entry transaction has been completed by the user but some errors in the data were identified, the GUI should allow users to immediately correct the information directly [26]. Due to the nature of the task, event sequencing is suitable for checking this heuristic. The test would first complete an erroneous transaction, checking the possible error message, then proceed by correcting the mistake, and finally commit the transaction a second time, checking the successful execution of the action.
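The log-off scenario of guideline 3.5/11 can be sketched with a minimal session model. `Session` is a hypothetical stand-in for the application under test: the first simulated logout must be blocked by an advisory message while work is pending, and only a confirmed logout may proceed.

```python
class Session:
    """Stand-in app session: logging off with pending work must warn first."""
    def __init__(self):
        self.pending_transactions = []
        self.messages = []
        self.logged_in = True

    def log_off(self, confirmed=False):
        if self.pending_transactions and not confirmed:
            self.messages.append("Unsaved data will be lost. Proceed?")
            return  # logout blocked until the user confirms
        self.logged_in = False

s = Session()
s.pending_transactions.append("draft order")
s.log_off()                          # first simulated logout event
assert s.logged_in and s.messages    # advisory message shown, still logged in
s.log_off(confirmed=True)            # the user confirms the data loss
print(s.logged_in)                   # False
```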
4.0/24 Flexible User Guidance: If techniques or design choices associated with user guidance might slow down experienced users, the GUI should allow users to by-pass them and present alternative paths or modes to the default guidance procedures [26]. The behaviour of skilled users cannot be predicted and tested with confidence. However, if the action path offered for experienced users can be represented in a shorter sequence of GUI events, then it can be tested.

4.1/1 Indicating Status: When the status of the system is relevant to users, it should be indicated on the GUI at all times [26]. Checking the status would require having a GUI event listening for modifications in the status, while other GUI tests are triggering different changes in the system status. The automation is successful if the GUI event listening can correctly detect all the changes inflicted on the system status.

4.3/8 Multiple Error Messages: If multiple errors are identified inside one combined data entry, then the GUI should notify the user for every single error, without displaying a single message for all of them together [26]. This can consequently be tested by checking how many error messages are prompted by the system when many errors occurred. If multiple error messages are returned, the test should identify how many errors were found. If only one error message is prompted while many should be identified, then the test fails. However, the message could hint to the number of additional errors, by including details of only the first error and, at the end of the first error message, for instance the wording "and 4 more errors" instead of displaying 4 additional message boxes. In this context, detecting the appearance of only one error message while omitting the details of the additional ones would be enough for the test to pass. Additionally, it is also possible to make the automated test check if the message content shows the number of additional errors whose details were omitted.

4.3/13 Cursor Placement Following Error: After an error message is prompted and handled by the user, the cursor focus should be positioned at the GUI element that resulted in an error [26]. Since a series of user and system actions is needed for checking this guideline, event sequencing is needed in the test: it is possible to check if the cursor is located in the first erroneous field after dismissing an error message.

4.3/15 User Editing of Entry Errors: Once an error is identified and the user is asked to correct it, he should only have to fix the portion of data that caused the error, instead of the entire task [26]. Taking for instance the context of form filling, a possible sequence of events testing this guideline can start by completing a data entry with just one erroneous input. After the appearance of the error message and its dismissal, the test can then correct the problematic input, and finally submit the data a second time without touching the other data fields. If the task is completed successfully, the test would pass. However, if the GUI requires re-entering more than just the corrected data, this constitutes a violation of the guideline and the test would fail.

4.4/22 Record of Past Transactions: It should be permitted for users to ask for records of previous transactions to be displayed. This would help them review past actions [26].
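The resubmission check for guideline 4.3/15 can be sketched as follows. `submit` is a hypothetical stand-in validator for the form under test: the script provokes exactly one field error, corrects only that field, and verifies that the second submission succeeds without re-entering the untouched fields.

```python
def submit(form):
    """Stand-in validator: returns a list of per-field error messages."""
    errors = []
    if "@" not in form.get("email", ""):
        errors.append("email: invalid address")
    if not form.get("age", "").isdigit():
        errors.append("age: must be a number")
    return errors

form = {"email": "user@example.org", "age": "abc"}
errors = submit(form)
assert errors == ["age: must be a number"]  # exactly one erroneous input
form["age"] = "30"                          # correct only the flagged field
print(submit(form))  # []: second submission succeeds, email left untouched
```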
of chain operation destructive The a of confirmation for action additional Actions: an Destructive of Confirmation User 6.0/18 cannot sequencing. which event scenario, [26]. GUI same on the action relying interrupting recreating without the done require confirm be would notify to automatically first user should guideline the GUI this ask the and Checking then loss data, data of potential loss the in of result user would the that manner a in sequence transaction Interrupts: from Protection 6.0/5 with tested that be to can behaviour back system brought described be This should the invalid. user field, is the format address sequencing. of address event the attention the leaves as The cursor long email. as the a field the moment sending checking the start before context, can mistakes transmission computer possible data of particular notified this be In should users and [26]. format, message and content its ing Checking: Address Automatic 5.2/14 specifying by action results. search search the the examining initiating then directory. with and address start address, The an would an name. of guideline to partial part a this apply handle test can only queries to to search needed if have sequence user-friendly not more be does would guideline operation search this Any angle, [26]. names contemporary partial a with From search the allowing by them assist should Search: Heuristics system adequate Directory Structuring for GUI. and Aids the Deriving instance, by 5.2/5 For followed 5.2. being displayed. is of recommendation be sequence this could second means operations a files previous opened and last to transactions, of related list past data a the showing if represent check could could events events GUI GUI of sequence first A Concept of Proof 5. 
5.2.3 Listing Heuristics Fitting Automation without Sequencing

This list includes heuristics which can be verified automatically without the need for GUI event sequencing, but which are still relevant in the general context of Windows desktop applications. Each heuristic starts with its reference number in the original report developed by Smith and Mosier, followed by either a partial or complete description of said guideline, and then by the reason it is believed to belong to this category. Inclusion in this class is based on the nature of the guidelines: if the needed property is directly catchable at the start of the test, then the state of the property can be assessed independently from potential user actions and verified without needing to mimic a user. The same applies for guidelines whose focus is on attributes that are persisted in the GUI independently from the system's back end. These attributes can be verified by means of a GUI testing tool without a GUI event sequence. These guidelines are listed below [26]:

1.4/12 Marking Required and Optional Data Fields: The GUI should clearly and consistently differentiate between optional and required data input fields whenever a form is presented to users [26]. Verifying if a form presented in a GUI graphically differentiates between required and optional input fields is possible in an automated test. After identifying the manner in which fields can be marked across the tested form, the test then only needs to check if the "required" (or the optional) fields are indeed marked consistently. It is important to note that the test does not have to verify if the fields marked as required are in fact treated as such by the system, because compliance with this guideline only concerns the existence of the mark and the difference between the visual attributes in the interface. Thus, the test does not need to interact with the fields or to trigger certain GUI events to enact a certain scenario: the corresponding marks should always be displayed when the test starts. In other words, the test does not need to create a GUI event sequence.

1.4/17 Consistent Label Format: A consistent label format would relate each label to its input element whenever data fields are dispersed across the GUI [26]. Verifying this guideline can be done automatically, for instance in order to check whether the labels are always aligned to the left of the fields. The tests require access to the coordinates of these GUI elements (among other types of properties), for instance to check whether the labels are placed immediately above or to the left of the fields. However, there is no reason for actual interaction with them, and no sequence of GUI events needs to be simulated.

1.4/18 Label Punctuation as Entry Cue: For every input field, the label should be followed with a special character, indicating that data entry is allowed [26]. This guideline can be checked automatically by testing whether all labels (that precede input fields) end with the same symbol (such as a colon).

1.4/24 Compatible Form for Data Entry and Display: Entering a data form and later reviewing the entered data should seem similar for users: the data should appear in the same locations on the GUI, and the visual layout of the data display elements, their labels, and their form of data entry should be consistent [26]. Comparing both the order and the layout of the data in the entry form and in the display form can be performed automatically. These kinds of tests do not require a sequence of GUI events, and can benefit from the assistance of tools that specialize in visual testing, such as Applitools [161], especially when paired with concepts such as responsive design (which readjusts the layout of the GUI according to the screen size).
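As a minimal sketch of the kind of static property check described for guideline 1.4/18 (the thesis implements it in UFT/VBScript; the label texts below are invented):

```python
# Hypothetical sketch of guideline 1.4/18: every label preceding an input
# field should end with the same cue character (here a colon).

def labels_missing_cue(labels, cue=":"):
    """Return the labels that violate the punctuation-cue convention."""
    return [lab for lab in labels if not lab.rstrip().endswith(cue)]

form_labels = ["First name:", "Last name:", "Date of birth"]  # last one violates
print(labels_missing_cue(form_labels))  # → ['Date of birth']
```

No user interaction is simulated: the check reads properties that are already present when the form is displayed, which is exactly why this guideline fits automation without sequencing.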
1.4/26 Minimal Cursor Positioning: When the user is filling a form, the movement of the cursor from one field to the other should be minimized [26]. In the context of filling forms, this guideline recommends placing all required input fields before optional ones, and not placing fields too far from each other, in order to improve the efficiency in data entry. Judging whether the visual arrangement of the displayed input fields actually leads to minimal cursor positioning might be tricky to automate. However, provided the test script has sufficient knowledge about the nature of those fields (such as which input elements are required and which are optional), it also has access to their properties and positions, and the test would detect violations such as optional fields that precede required ones, or input fields placed too far from the rest. Since this study is focused on desktop applications rather than on phones and tablets, variations in screen dimensions should always be less extreme, and the guideline can be applied.

1.4/28 Automatic Cursor Placement: When presenting a form, the GUI has to position the cursor at the start of the first input field [26]. Checking the position of the cursor at default focus can be automated without the need for creating a sequence of GUI events. Only the first entry field has to be examined, to check if it has the cursor.

1.5/2 Distinctive Labels: When a table is displayed, the format of the column headers should be different from the rows, in order to make it easier for users to distinguish the data entries [26]. It is possible to automatically retrieve the formats used for the labels and for the data entries presented in a table, and to compare them. Since the labels and the data would be on the GUI at the same time, there is no reason to add a GUI sequence to the test.

1.8/4 Display of Default Values: During data entry, the defined standard values could be displayed in advance on their respective input fields [26]. Moreover, default values should not be displayed in a manner different from newly typed data values, in order not to confuse users, and it should not be expected from users to remember the default values. The existence of those provided default data entries in data fields can be verified with automation. There is no need to create a sequence of GUI events just to check a default value.

2.0/4 Data Display Consistent with User Conventions: "Display data consistently with standards and conventions familiar to users." [26] An automated test can for instance check if the calendar presented in a localized version of the GUI is formatted depending on where it is presented geographically, such as with the European or the American system of measurement. Provided the tested unit of data is displayed on the GUI, the automated test can check its consistency with conventions without simulating any additional user actions. The same can apply for any data element on the GUI.

2.5/6 Page Labeling: When information is displayed across multiple pages, the label on each one of them should signify its relation to the others [26]. Provided the relation between pages (and even among windows) is consistent between releases, it is possible to write an automated software test that only checks if the expected labels are expressed. Since these labels would only need to be accessed by the test, without simulating a user interaction, the test does not need to include a sequence of GUI events (but if those relations only appear under a certain usage scenario, then a sequence would be needed).
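The header-versus-rows comparison of guideline 1.5/2 can be pictured with a small Python sketch; the property dictionaries are invented stand-ins for what a GUI testing tool would expose for each element:

```python
# Hypothetical sketch of guideline 1.5/2: the format of a table's column
# headers must differ from the format of its data rows. Property names
# ("font", "bold") are illustrative only.

def labels_are_distinctive(header_fmt, row_fmts):
    """True if the header format differs from every data-row format."""
    return all(header_fmt != fmt for fmt in row_fmts)

header = {"font": "Segoe UI", "bold": True}
rows = [{"font": "Segoe UI", "bold": False},
        {"font": "Segoe UI", "bold": False}]
print(labels_are_distinctive(header, rows))  # → True
```

Because header and rows are on screen simultaneously, the comparison is a one-shot property read, with no event sequence involved.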
2.6/26 Color Coding for Data Categories: If multiple separate data categories have to be differentiated by users, then it is advised to attribute a unique display color to each of the categories [26]. The detection and comparison of the colors of graphical elements can be checked in an automated test without the need for event sequencing.

3.1.8/3 Iconic Menus: If users do not share the same language or technical background, it is advised to present menus with graphical icons symbolising the different possible user actions [26]. The existence of such an iconic menu or button on the GUI can be detected automatically by a test. No GUI sequence is required to verify if such a GUI element is displayed on the interface.

4.3/5 Brief Error Messages: Error messages should not have a lot of text but still be informative [26]. A tester can verify the length of error messages at the back end directly, without needing to perform a series of GUI tests. Generally, error messages are grouped and well organized inside the application, which makes retrieving them simpler than performing GUI tests.

4.4/23 HELP: Users should be allowed to request additional guidance by invoking a "Help" command through the GUI [26]. It is possible to use automation to identify and check for the presence of an element representing a "Help" button, icon or menu option on the GUI. No sequence of GUI events is needed to prove the existence of such an option. However, if multiple different "Help" options are offered under different usage scenarios, a GUI event sequence would be needed. In other words, automation without sequencing can prove with confidence that a help element is present, while only automation with event sequencing could test the guidance offered under different usage scenarios, and no amount of automation can prove that no other help option is needed.

5.1/5 Automatic Message Formatting: If a text message has to conform with a defined standard format, or if some parts of the message can be predicted by the system, then the GUI should assist users by providing automatic means for checking or preparing the format of the message [26]. Checking if text messages are properly formatted can be performed automatically, without the need for re-enacting a scenario through a series of GUI events; the test evaluating the formatting assistance does not even need to be a GUI test.

6.0/13 Safe Defaults: The defined default values should not contribute to the risk of data loss or make the task unnecessarily more complicated [26]. These types of tests do not even need to happen in the GUI: the evaluation of whether the defined default values lead to data loss situations can be performed by testing the system's back end.
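For guideline 4.3/5, the length check over a retrieved message catalog is trivially automatable; the following Python sketch assumes the messages have already been extracted from the application, and the word limit is an invented threshold, not one taken from the report:

```python
# Hypothetical sketch of guideline 4.3/5: error messages pulled from the
# application's resources should stay short. The 20-word limit is assumed.

def overly_long_messages(messages, max_words=20):
    """Return the messages exceeding the assumed brevity threshold."""
    return [m for m in messages if len(m.split()) > max_words]

catalog = [
    "File not found.",
    "The entered date must use the format DD.MM.YYYY.",
]
print(overly_long_messages(catalog))  # → []
```

Whether a short message is still informative remains a manual judgment; only the brevity half of the guideline lends itself to this kind of check.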
6.1/5 Private Entry of Passwords: When a user enters a password, the GUI should ensure the privacy of the input [26]. In other words, the content of the password should not be displayed on the GUI. Only the properties of the password input element would have to be accessed by the test, which should check how the entered data is displayed, not the data itself. There is no need to create a sequence of GUI events or to simulate a user action; therefore, the test script can be automated without event sequencing.

5.2.4 Listing Heuristics Suitable for Manual Testing

This list holds heuristics relevant in the general context of Windows desktop applications which should be verified manually. Just like in the previous lists, each heuristic in question starts with its reference number in the original report, followed by either a partial or complete description of the guideline, and then by the reasoning behind its category. In general, these heuristics focus on attributes that are challenging to assess by a GUI automation tool. Guidance already provided in a section above will not be included again, and highly similar recommendations are left out (as they are judged to be duplicates). Such guidelines are listed in the following [26]:

1.4/11 Prompting Field Length: When a minimal, maximal, or an exact character string length is associated with a data input field, that information should be conveyed to users on the GUI [26]. Checking compliance with this guideline does not mean verifying if input fields have a maximal or minimal length, but rather verifying the semantics behind the label of a particular field. Understanding if a label correctly conveys its significance is a challenging task for automation, because the same meaning can have various representations (such as "at least ten characters", "minimum 10 characters", "10 or more characters", or "> 10"), especially when considering automated tests that have to pass on all localised versions. As a result, judging the usefulness of the information included in labels should be verified manually. The same logic applies for other guidelines verifying date formats or units of measurement.

1.4/19 Informative Labels: As long as it is not stated otherwise in the requirements, the labels used across the GUI should be descriptive, or else utilize predefined terms, defaults, or abbreviations [26]. Since it is challenging to automatically judge how informative a label is, verifying this guideline is best done manually.

1.4/27 Logical Order of Data Items in a Sequence: Data items should be presented in the same order in which an actual human user would think of them [26]. Checking this guideline should be done manually, because the perspective of the user is required for performing the test.

2.0/2 Only Necessary Data Displayed: The amount of displayed data should be adjusted to user needs. The only data displayed to users should be the immediately usable information that is necessary for performing a task. The quantity of data should be customizable enough to not overload the user or burden the task, and if needed, the GUI should permit users to select which data become displayed and which data to hide [26]. Since it cannot be anticipated in advance which data may prove necessary for which user, and since it is difficult for a machine (and by extension for an automated test) to understand which data should be displayed and which ones hidden, it is more appropriate to check this guideline manually.
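The property-only nature of the password check (guideline 6.1/5 above) can be sketched as follows; the property dictionary imitates what a GUI testing tool would expose for the element, and all names are invented:

```python
# Hypothetical sketch of guideline 6.1/5: only the static properties of the
# password element are inspected, no user action is simulated. Property
# names ("masked", "max_length") are illustrative only.

def password_entry_is_private(field_props):
    """The field must mask its content (e.g. an edit box in password style)."""
    return bool(field_props.get("masked", False))

password_field = {"type": "edit", "masked": True, "max_length": 64}
print(password_entry_is_private(password_field))  # → True
```

This contrasts with the manually tested heuristics that follow: here a single readable attribute decides the verdict, whereas the manual guidelines hinge on human judgment that no attribute exposes.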
2.0/15 Consistent Grammatical Structure: Across the whole GUI, the same linguistic and grammatical structure has to be used to refer to the same data elements, so as not to confuse users [26]. This linguistic and grammatical consistency has to be checked manually. Even though a program can check for simple grammatical errors, the problem becomes more complex with variations in wording that refer to the same concept, and more noticeable when the concept is expressed in multiple words. In other words, the problem is not checking the grammar, but rather testing if the displayed data and labels are linguistically consistent across the entire interface. It is therefore more appropriate to check the consistency manually.

2.1/23 Logical List Ordering: Lists should be ordered in accordance to a logical principle [26]. However, if no logical principle can be applied to the list, then it should be ordered alphabetically. Nevertheless, the perspective of the user should always prevail over the perspective of the designer. Verifying whether a list is logically ordered is a task that should always be performed manually from the perspective of the user (and ideally by an expert).

2.5/16 Data Grouped by Importance: In case the presented data items are especially important for users, they should be grouped and clearly displayed, for instance at the upper area of the GUI. Such items could represent any critical data requiring immediate user response, or the success or failure information of the current task at hand [26]. Judging the importance of data items for users cannot be determined in an automated manner; it is therefore better to rely on a manual evaluation to determine whether this recommendation was validated or violated.

2.6/24 Brightness Inversion: Inverting the brightness of a graphical element, so that dark characters on a bright background are changed to bright items on a dark background, can be used to attract user attention to critical data areas [26]. Compliance with this recommendation can be checked manually by a usability expert, because it is challenging for automation to determine the importance of graphical elements, and thus to decide if an element needs to be highlighted or, vice versa, if an already highlighted element should not be.

2.6/28 Conservative Use of Color: Coloring should be done moderately and conservatively when designing the interface, and the GUI should contain a relatively small number of colors [26]. The correct use of color on a GUI can currently only be verified manually by an expert. Therefore, tests of this sort should be performed manually (but it might be possible to rely on machine learning to automatically detect such color-related design violations at a future date [128]).
3.0/2 Minimal User Actions: Control actions should favour simplicity, especially when the task performed needs a quick response from the user, and the transaction sequence should contain a minimal number of user actions [26]. The logic employed in the sequence of actions has to match the abilities and experience of the user. The verification of this recommendation is also difficult to conduct automatically: it is simpler to check manually, with an expert where appropriate, if user actions are really kept to a minimum, or if some improvements are still possible.

3.0/16 Compatibility with User Expectations: The results of an operation should be displayed in a manner compatible with user expectations, and the compatibility between user actions and system responses should seem natural [26]. This compatibility should be checked manually, because the perspective of a human is required to evaluate such an important concept.

3.1.3/11 Menu Options Worded as Commands: Menu options have to be worded clearly and consistently, in a manner that represents commands directed at the system rather than questions directed at the user [26]. The semantics behind a word used on a menu or a button are better understood by a human than by a machine. Therefore, this heuristic should be evaluated manually.

3.1.3/22 Logical Grouping of Menu Options: A menu has to be presented in a way that relates to how logically connected a group of its options are [26]. The correct grouping of options, or their presentation in a logical order, cannot be checked automatically. An expert has to manually verify if a menu violates this recommendation.

3.4/1 Defining Context for Users: The context of use has to be maintained while the user is going through a task consisting of a sequence of transactions. For instance, the result of a previous transaction can affect the options available during the current step of the task [26]. Even though the display of previous operations and the availability of options can be checked with automated tests, it is challenging to assess the appropriateness of the context of use with a machine. After an updated software release, the design of certain tasks might change drastically, and the amount of information and options offered to the user might change with it. It is more appropriate to have a usability expert manually evaluate the context of use.

4.0/17 Task-Oriented Wording: The label wording, prompted messages, and user guidance employed have to be task-oriented. In other words, the terms should not be technical expressions used by software developers; they should be understood by the end user [26]. Checking how effective the wording is in guiding the user toward successfully performing the task at hand should be done manually, preferably by an expert.

4.3/1 Informative Error Messages: In case the system identifies an error made by the user, the prompted error message should notify the user of what went wrong, and inform him on how to correct the mistake [26]. Judging how informative an error message is for a user in a specific context cannot be measured by an automated test. These sorts of tests should be performed manually.
4.3/6 Neutral Wording for Error Messages: The text of error messages should always be neutral [26]. That is to say, the text should not personalize the computer, should not be funny, and should not contain irony or blame the user. The evaluation of the wording used in error messages could benefit from machine learning techniques used in sentiment analysis, to categorize messages into positive, negative and neutral classes. However, it is not preferred to perform these tests with automation (due to how challenging it is for a machine to detect humour or irony in error messages). Unless the number of error messages in a technical system is deemed too high for one expert to evaluate, testing them manually is the better choice.

4.3/12 Documenting Error Messages: User guidance should include the possibility to view a detailed and documented explanation of every possible error message that could be returned by the system [26]. Naturally, this guideline should be checked manually, since an automated test cannot check if an error message is properly and sufficiently explained in the help section or in the documentation.

6.0/8 Appropriate Ease or Difficulty of User Actions: The completion of a task should be made as easy or as difficult as it needs to be: high priority or frequent user actions should be performed easily, while actions that could result in data loss should include an additional step, for example requesting user confirmation [26]. The evaluation of the ease or difficulty of the task at hand, and of whether it fits with the user's cognitive attributes, should be performed manually by an expert.

6.1/1 Easy LOG-ON: Logging in to the application should be designed in a manner that favours the easiest possible access, while balancing the need to protect the system from unauthorized access [26]. Judging whether the LOG-ON action is easy or difficult is not a property that is directly observable (or catchable) by testing tools. The ease of use of the GUI elements involved should be assessed by a usability expert.

6.5/2 GUI Protection from Design Change: Any modification made to the design of the GUI should not hinder or weaken the functions of the areas of data entry, data display, sequence control, user guidance, data transmission, and data protection [26]. This guideline has one of the largest scopes in the report. In particular, it focuses on detecting a decrease of usability on any level: even though it is classified as a data protection recommendation, it covers all six functional areas of user-system interaction. In other words, it is reasonable to assume that a design change might weaken the inner usability of an interface if it violates the vast majority (if not all) of the heuristics described throughout this section. Therefore, this particular guideline also encapsulates those other guidelines that can be automated. Nevertheless, assessing the overall effect of a design change on the usability of an interface requires the broad expertise of a usability professional, and should be done manually (but if the automated guideline evaluation is implemented, it can save the expert a lot of time, since only the manual tests would be left to him).
5.3 Development of Heuristic Evaluation in a GUI Testing Tool

In order to prove that a GUI testing tool is capable of improving the usability inspection process, some heuristics are selected to be tested in a GUI test automation tool. The selected guidelines come from the list of heuristics derived from the work of Smith and Mosier (described in section 5.2), and belong to the category of heuristics suitable for automation with GUI event sequencing (which was derived in section 5.2.2). Thus, the practical experiment requires an adequate GUI testing tool capable of executing automation scripts, some instances of Windows software applications to serve as the test ground, and a list of guidelines to verify. In other words, the applications take part in the experiment as test subjects, and the testing tool is used for the verification of guideline violations.

5.3.1 Preparing the Needed Development Environment

Among the GUI testing tools presented in section 4.1.3, Unified Functional Testing (UFT), developed by Hewlett Packard Enterprise (HPE) [152] and later by Micro Focus, is selected as the tool with which the testing scripts are written and executed. It is one of the leading test automation tools for Windows applications on the market; the installed version is 14.03. Reliable performance and object identification are among its features, and it therefore worked optimally when the GUI scripts were executed. It offers a variety of add-ins to support working with applications built on technologies, languages, and frameworks such as Visual Basic, ActiveX, Java, .Net, and Microsoft UI Automation.

In fact, additional software programs are also needed to serve as test subjects: the purpose of the specified programs is to provide the GUI under which a selected heuristic can be checked for violations. To that end, two software applications are experimented upon: the Eclipse Photon IDE (version 4.8) [206], which runs under the Windows operating system, and the MyFlight Sample Application, which is incorporated by default with UFT (version 14.03) [152].

First of all, Eclipse is one of the most widely used IDEs for Java developers [207]. The Eclipse IDE is continuously supported by the Eclipse Foundation, counting on the contributions of big corporations, some small companies, and enthusiasts. It is flexible, rich in features, built by an active open source community of developers, and available free of charge under the Eclipse Public License (EPL), which makes it a good subject for checking usability guideline violations. Eclipse's GUI is built using the Java programming language, and the developing environment is built around it. Before executing any automated GUI tests centered on Eclipse, the UFT Java add-in must be loaded, or else the UFT tool cannot identify the displayed GUI objects.

As for the second GUI software, the MyFlight Sample Application is a simplified flight reservation simulator. It was developed by the developers of UFT with the intention of helping software testers learn how to use the tool for automating GUI and API tests, and it mimics typical features of reservation systems, such as booking, browsing, and searching. Since the main purpose of this software is encouraging testers to experiment with UFT, it does not require Internet access and the tests do not require changes to its settings, which improves the portability of the GUI tests. The MyFlight Application GUI is built using the .Net Windows Presentation Foundation (WPF) framework [208]. Thus, its test scripts require the web and .Net add-ins, which in turn are dependent on the WPF add-in installed with UFT [208]. In other words, UFT requires the web and .Net add-ins to be loaded each time the tool is starting up.
In order to diminish the required effort, minimize the size of the tests, and facilitate understanding what the tests describe, all tests created in UFT can be divided into multiple logical sections called actions, which work similarly to regular functions in the VBScript scripting language [210]. The order in which such actions are executed by UFT can be displayed in a convenient flow diagram generated by the tool [211]. Some test steps are designed to be reused (or recalled) during the execution of many different tests. However, before focusing on the individual automated heuristic inspections, it is first needed to explain some UFT-specific terms and what most of these tests have in common.

5.3.2 Instances of Usability Heuristics Evaluation in a GUI Test Automation Tool

The guidelines selected to be experimented upon with UFT are stated in table 5.1 below. The choice of which guideline to test on which application is partly based on the degree of applicability of each guideline to the GUI of the two different applications, and partly on the need to balance the number of guideline tests equally between the two applications. At least one heuristic is selected from each of the functional areas covered by the work of Smith and Mosier (section 5.2), which are: data entry, data display, sequence control, user guidance, and data protection, that is, all areas except the fifth one, data transmission. Throughout this section, the experimentation with each guideline is preceded by a brief description of that particular guideline, which was already briefly discussed in section 5.2.2; the guidelines appear in the same order as in the table. The complete source code for automating all the tests is publicly available on the GitHub platform [1], published under the GNU General Public License (GPLv3) [209], and the tests are detailed below.

On a side note, in case the test steps are being executed too fast for a human observer to comprehend, it is possible to slow the execution down by programming some wait time between the individual steps. It is also possible to achieve the same while keeping the test script unchanged: within UFT, select the "Tools" menu and click the "Options" menu item; once the window opens, select the "GUI Testing" item in the list, then "Test Runs" under it, check the "Normal" radio button of the "Run Mode" options, and finally specify the delay time in milliseconds.

Table 5.1: Usability guidelines selected for practical experiments

Reference [26]   Guideline Title in the Original Report [26]    Tested Application
1.0/4            Fast Response                                  MyFlight Application
1.3/10           Upper and Lower Case Equivalent in Search      MyFlight Application
1.4/15           Explicit Tabbing to Data Fields                MyFlight Application
1.7/3            Non-Disruptive Error Messages                  MyFlight Application
2.7.5/3          User-Specified Windows                         Eclipse Photon IDE
3.0/1            Flexible Sequence Control                      Eclipse Photon IDE
3.0/20           Indicating Control Lockout                     MyFlight Application
3.1.3/7          Menu Selection by Keyed Entry                  Eclipse Photon IDE
3.2/10           Only Available Options Offered                 Eclipse Photon IDE
3.3/3            CANCEL Option                                  Eclipse Photon IDE
3.5/10           UNDO to Reverse Control Actions                Eclipse Photon IDE
4.3/13           Cursor Placement Following Error               MyFlight Application
6.0/5            User Confirmation of Destructive Actions       Eclipse Photon IDE
6.0/18           Protection from Interrupts                     Eclipse Photon IDE
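The effect of that delay setting can be illustrated outside UFT. The following is a minimal sketch in Python (not UFT's VBScript; the step list and function name are hypothetical) of the scripted alternative: sleeping between steps so a human observer can follow the run.

```python
import time

def run_steps_slowly(steps, delay_ms=0):
    """Execute scripted steps with a pause between them, mimicking a
    "Normal" run-mode delay specified in milliseconds."""
    results = []
    for step in steps:
        results.append(step())
        time.sleep(delay_ms / 1000.0)
    return results

# Hypothetical steps standing in for GUI events:
steps = [lambda: "enter name", lambda: "enter password", lambda: "click OK"]
print(run_steps_slowly(steps, delay_ms=0))
# -> ['enter name', 'enter password', 'click OK']
```

The advantage of the UFT option over this approach is that the delay can be changed without touching the test script itself.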
Such UFT test elements can be grouped together into logical sections of actions called external reusable actions [210]. A couple of examples are the actions needed at the beginning and at the end of all tests under the two applications: starting and closing each one of the two programs (Eclipse Photon and MyFlight respectively). Therefore, the two applications simply have a general test case present, so that the other test cases, which are composed only of actions that check for heuristic violations, can avoid duplicate code by externally calling the reusable actions whenever needed.

The first general test, whose flow diagram alongside an expanded view of its four nested actions is presented in figure 5.1, describes a general Eclipse test. In summary (excluding nested actions), it starts the Eclipse IDE, creates a Java project, then deletes it, and finally closes the software. The first test action, called "Start Application", forms a string representing the installation path of the software from the operating system's environment variables, starts the Eclipse IDE, and waits for the pre-launch window of the application. Once Eclipse is started, that window asks the user for a workspace, and the action creates a new one so that it would not interfere with the default workspace (or with an existing one the user might have open). The newly created workspace is located in the same parent folder as the default one; this isolates the testing environment from unforeseen factors and prevents the testing efforts from interfering with a workspace opened in a later execution of the test script. Finally, if the pre-launch window is not prompted during startup because of the current settings, the "Start Application" action would change the settings so that the window appears again on the next startup. That explains why "Start Application" contains a nested action called "Application Launcher", which in turn calls nested actions to change the setting and to close and restart the Eclipse IDE.

Figure 5.1: UFT flow diagram for a general test case in Eclipse Photon with an expanded view of its nested actions

Secondly, the test actions which are common among two or more test cases of the MyFlight application are all grouped under one test called "Common Behaviour Test". This test case is built around the GUI of MyFlight, and its UFT flow diagram is shown in figure 5.2. It contains nine self-explanatory actions. First, it starts the application, then it logs in. Once in, it searches for a flight and books it (which includes entering the passenger name for the order). After that, it goes back to the initial search window, searches for the particular order that was entered, selects it from the search results, and then deletes it. Finally, the last action shuts the application down.

Figure 5.2: UFT flow diagram for a general test case in MyFlight application

Since the common actions among test cases have been discussed, the instances of tests checking for guideline violations can be examined. Across the following sections, it is important to keep in mind that these automated tests are simplified problem instances of guideline checking, whose aim is to check whether heuristic evaluation can really be automated in practice. Any of these tests is presented alongside a description of its corresponding UFT flow diagram, but not all diagrams can be viewed where the tests are discussed; appendix A is where all the additional diagrams are grouped.

Experimenting with Data Entry Guidelines

The first functional area of user-system interaction [26] is data entry, and the first data entry guideline to be tested is titled "Fast Response". As aforementioned in the previous section, this guideline is still valid according to Smith and Mosier, and has a high impact on usability [119].
It instructs developers that for a normal operation, the delay of the displayed feedback should not surpass 0.2 seconds. Needless to say, what is considered a normal operation on a mainframe computer in the 1980s is greatly different from a normal operation on a contemporary average computer. Nevertheless, there are still three time limits to consider when testing responsiveness [205] [119]:

• A tenth of a second is about the limit for the user to perceive the system reaction as instantaneous (such as the time that passes between typing characters and the text being displayed on the screen).
• One second is the upper limit when the goal is for the user's flow of thought not to be broken (such as when the user is editing data).
• Ten seconds are the maximum when the goal is to keep the attention of a user (as can happen when waiting for the computer to finish processing a task).

Nowadays, displaying typed text is typically handled by the operating system. Therefore, the test checks if the GUI of the MyFlight application reacts in less than one second to a login and to a search operation. Figure 5.3 displays the UFT flow diagram of the test; only the action that measures the login response time is shown, because the search operation of the same test is measured in the same way.

Figure 5.3: UFT flow diagram for guideline 1.0/4 "Fast Response"

The GUI event sequence used for performing the login part of the test can be seen in listing 5.1. First, the agent name and the password are entered in the right text fields (as demonstrated in lines 1 and 2), then a timer logs the starting point (seen in line 3) before the submitting of the form occurs (as seen in line 4). Once the existence of the GUI element associated with the expected ending result is confirmed (which happens in line 5), another timer logs the ending time (line 6). The difference between the two timers corresponds to the rounded-up duration of a login (line 7). If the resulting value is less than one second (entering the if-branch in line 8), the test reports a success (line 9). Otherwise, the test reports a failure (and thereby executes lines 10 and 11).

1  WpfWindow("HPE MyFlight Sample Application").WpfEdit("agentName").Set "john"
2  WpfWindow("HPE MyFlight Sample Application").WpfEdit("password").SetSecure "5b51f6f18e41810734bb"
3  loginStarted = timer
4  WpfWindow("HPE MyFlight Sample Application").WpfButton("OK").Click
5  If WpfWindow("HPE MyFlight Sample Application").WpfComboBox("fromCity").Exist Then
6      loginFinished = timer
7      loginTime = loginFinished - loginStarted
8      If loginTime < 1 Then
9          Reporter.ReportEvent micPass, "Login Response Time is less than a second", "The user's flow of thought is uninterrupted"
10     Else
11         Reporter.ReportEvent micFail, "Login Response Time is more than a second", "The user's flow of thought might be broken"
12     End If
13 End If

Listing 5.1: Code Snippet with the GUI event sequence verifying login responsiveness [1]

The second data entry guideline to be experimented upon is called "Upper and Lower Case Equivalent in Search". As the name implies, a test of this guideline would check if the system differentiates between search queries with upper and lower case letters. Most early GUI developers did not consider dealing with this issue, and without it being advised to, the problem did not disappear; it still resurfaces nowadays, albeit less frequently, so checking the guideline is still good practice. Figure 5.4 shows the UFT flow diagram of a test called "Search Same Order Differently". The test script simulates the GUI operations of a search on the MyFlight application, then checks it for compliance with this guideline. During the test, a booking order is performed on the GUI, and a search for the order is simulated by an action called "Display Order Details", which checks if the actual result is the expected one. The test is repeated four times, each iteration with a different expected passenger name: "jane", "Jane", "JANE", and "jaNE" (being stored in a global data table); the distinction between the names is only in terms of upper and lower case letters.

Figure 5.4: UFT flow diagram for guideline 1.3/10 "Upper and Lower Case Equivalent in Search"
the the to the elements for written in merit GUI than be seen logical which rather can their be in importance test can knows of automated As tester order in an the color). go Provided ested, red window appearance. Steam a of the in of order enclosed elements in are seventh (which and corresponding elements sixth numbers contains the GUI window figure, of Each order being left. tabbing applications, the two on the of [212] to to windows Steam login top impor- and the most right, from shows next the and 5.5 the on Figure to MyFlight right, user. tabbing to the after to the left move element to to from interactive appropriate move tant starting more would sometimes, appearance, focus is of cursor it However, order key, receivebottom. in tab elements the element GUI pressing GUI which after interactive in Generally, next order the tabbing. evaluate after to focus further cursor pushed be can recommendation This 5.2: Listing further follow is which to 5.5, easy figure is of next, half The the right one well. to the as on If element pass components below). graphical not line). GUI described one does numbered last from the test the moves by the in is focus exemplified pass, (shown (as button cursor not button "OK" the does that code the which source on of in the click order property in 8). a focus presented line by cursor checkpoints (in followed the the executed 9), that of is line confirms event in checkpoint tabbing focus. next final (demonstrated cursor the has the true and currently step, line field 7) last in text line (done a password by (in press In the key entered 3) if tab is line 6) second password test in line a the the by shown (in Afterwards, followed Then, checks (as 4), checkpoint field line one. second in text The expected (seen first predefined in 5). name the agent the (performed the on event with enter is tabbing placement to a position cursor proceeds cursor’s of current the simulation actual the if the Listing After confirms comparing guideline. 
test this action. performed the violate login all not 2), the were does line of flight, it content since a test, the ordering this shows and passed 5.2 flights, GUI for The position searching navigation. cursor’s Logging-in, tabbing the checks with entry. it key terms, simpler tab then in key, a tab or the focus, Toolafter pressing Testing input simulate has a application element guideline in MyFlight graphical entry Evaluation on which data Heuristic this checks third checking of the test Development is The Fields" automated. 5.3. Data be to to Tabbing "Explicit titled guideline the Furthermore, Concept of Proof 5. Chapter 10 9 8 7 6 5 4 3 2 1 ֒ ֒ ֒ ֒ ֒ pWno P ylgtSml WpfButton . ) " n o i t a c i l p p A Sample MyFlight HPE " ( WpfWindow pWno P ylgtSml WpfButton . micTa ) " n o Type i . t a ) c " i l n p o p i A " t ( a c i l p WpfEdit p A . Sample ) " n o i " Sample t ( a c MyFlight i l p WpfEdit p HPE A " . micTa ( ) MyFlight WpfWindow " n o Type HPE i " . Sample t ( a ) c WpfWindow " i l n p o p i A " t ( a c MyFlight i l p WpfEdit p HPE A " . Sample ( ) WpfWindow " n o i " Sample t ( a c MyFlight i l p WpfEdit p HPE A " . micTa ( ) MyFlight WpfWindow " n o Type HPE i " . Sample t ( a ) c WpfWindow " i l n p o p i A t a c MyFlight i l p p HPE A " Sample ( WpfWindow Sample MyFlight HPE " ( MyFlight WpfWindow HPE " ( WpfWindow → → → → → a key Tab e h t g n i s s e r P s e t a l u m i S ’ hc hcPit("Pswr a usrFcs") " Focus Cursor has d l e i F ) Password " " ( Focus CheckPoint Cursor Check has d l e i F AgentName " ( CheckPoint Check hcPit("KBto a usrFcs") " Focus Cursor has Button "OK ( CheckPoint t"jh " john " et S 5b161e1174b" b51f6f18e41810734bb "5 e r u c e S t e S oeSiptwt h U vn eunecekn abn aiaini h login the in navigation tabbing checking sequence [1] event window GUI the with Snippet Code gnNm . ) " agentName . ) " agentName asod"). ) " password . ) " password O").Click c i l C . ) "OK" ( O").Check . 
) "OK" ( b b b Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin9 128 / 97 Inspection Usability Heuristic-based of Automation Since IDE. the closing entirety. its finally in before passes test this Java setting, also the minimalistic of default perspective guideline, the the Java name its by deletes old the to abides test the obviously shows layout the Eclipse figure, it the Subsequently, the the restores side, how (default)"). on shows and right "Java observed 5.7 called perspective, the be is Figure can on (which As while perspectives. exist side, perspective offered rest. still left all created the with of the this listed list on if perspective the like verifies among looks test choice layout the new described Afterwards, "Minimalistic a called as perspective does. added custom test a is the the as and saved which a Explorer", be "Package closed), Perspective", then the were Java can "Ant", views layout named the are particular all which This (meaning and recalled, "Console". are editor add views the to Three for possible starts. except is test empty new it window. completely as same is the long layout on as the simultaneously passes Once then displayed the test time, be same the in to the views, of displayed at of portion window are kinds same This different views the remove on more views. all exist all incrementally still Once closing window, tabs earlier by view. same the proceeds different the if checks a on test to the displays layout, corresponding script same each test elements, the and action Perspective" tab all, main "Save GUI the of called test, actions First such nested This of more Window". two diagram "Reset interface. 
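The checkpoint chain of listing 5.2 can be abstracted into a generic tab-order check. The following Python sketch (a hypothetical model, not UFT code) simulates the focus chain produced by repeated tab presses and compares it with the expected order of elements.

```python
def tab_cycle(elements, presses):
    """Simulate the focus chain: focus starts on the first element and each
    Tab press moves it to the next one, wrapping around at the end."""
    visited = [elements[0]]
    index = 0
    for _ in range(presses):
        index = (index + 1) % len(elements)
        visited.append(elements[index])
    return visited

# Expected order for a login window like the one in listing 5.2:
expected = ["agentName", "password", "OK"]
observed = tab_cycle(expected, presses=2)
assert observed == expected, "tabbing order violates the expected order"
print(observed)  # -> ['agentName', 'password', 'OK']
```

Replacing the `expected` list with an importance-ranked list instead of the order of appearance yields the stricter tab-order evaluation discussed above.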
flow holds and UFT same Window", be the the can "Customize representing Windows" named on 5.6 "User-Specified is data figure guideline the in of checking seen variety if As testing a automated. for viewing candidate of ideal an possibility it the makes offers IDE displayed. precising Eclipse be and window, The to special expects a the he creating by user content accessed a of be of kinds actions to different the viewed need the simulates be data script can As the such data while, displaying of the them. elements types all of multiple graphical test, if the one check of tested is automation- to properties Windows" be order its the only "User-Specified in together, section, can of called previous them most heuristic the of in The though some described Even tests, already sequence. their event studied. for GUI sequencing be a event with to need area not do functional guidelines second friendly the is display Data Guidelines Display Data by with abides Experimenting application MyFlight since message case error this the in displays does only disrupts it and it which user because passes, the guideline. fail, test to this would the time test Then, enough the gives submit. Then, GUI on the pressed. if is However, button user. submit the the before and selected is 5.5: Figure Tool Testing a in Evaluation Heuristic of Development 5.3. Concept of Proof 5. Chapter ieb ieve ftetbignmrcodri h oi idw fSem(left), Steam of windows login the in order (right) numeric application tabbing MyFlight the and of view side by Side Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin9 128 / 98 Inspection Usability Heuristic-based of Automation 5.6: Figure Tool Testing a in Evaluation Heuristic of Development 5.3. Concept of Proof 5. 
Figure 5.6: UFT flow diagram for guideline 2.7.5/3 "User-Specified Windows" with an expanded view of its nested actions

Figure 5.7: An instance of an Eclipse custom Java perspective

Experimenting with Sequence Control Guidelines

The third functional area of system-user interaction is sequence control, and the first guideline through which its automated testing is going to be described is named "Flexible Sequence Control". Generally, the flexibility of GUI operations requires going through a high number of possible sequences: as described in section 1.1, even an application as simple as Microsoft WordPad contains over 324 possible combinations of GUI operations [2]. Checking all possible end to end paths can be challenging to automate; checking the flexibility might take the form of a specification-based testing technique (a testing manner explained in section 3.1.3), in which end to end paths are verified in a similar way [43]. However, a heuristic evaluation does not aim at uncovering defects or at increasing code coverage, but rather at identifying usability issues. The test can therefore be reduced to checking the exact sequence of operations that users are most likely to go through. To that end, a usability professional should specify hypothetical paths of operations that a better user experience is supposed to enable. For instance, a user can initially start a sign-up action from the sign-in window; if he stops the process midway through, he would expect to be able to continue from the operation where he stopped. Another example would be asking for guidance (such as pressing the "help" icon), then simply proceeding with the operation that was interrupted. This particular scenario is the one tested in the Eclipse IDE, with the action relevant to the task at hand.
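The help-request scenario reduces to a tiny model: an interruption must leave the user's position in the ongoing operation unchanged. A Python sketch with hypothetical names:

```python
class Wizard:
    """A multi-step operation that can be interrupted by a help request."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.position = 0

    def advance(self):
        self.position += 1

    def open_help(self):
        return "help window"     # the interruption

    def close_help(self):
        pass                     # must leave the position untouched

wizard = Wizard(["name project", "configure build path", "finish"])
wizard.advance()                 # user reaches the second page
before = wizard.position
wizard.open_help()               # ask for guidance mid-operation
wizard.close_help()              # dismiss it again
assert wizard.position == before, "the interruption broke the sequence"
print(wizard.steps[wizard.position])  # -> configure build path
```

The automated test below performs exactly this pattern on the real "New Java Project" wizard: open the help, close it, and confirm the project creation can still be completed.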
on order focuses check- displaying application, script page MyFlight test the the The on Lockout". heuristic Control this "Indicating called ing is guideline control sequence Another 5.8: Figure guideline the with compliance for the checking passes. in only it actual viewed is be Control", any test can Sequence provide as automated "Flexible not found", this does not since "Topic testing However, message issue during the 5.8. usability called displays figure different simply window a it help uncover since The users, helped for accidentally checking. guidance IDE was Eclipse it the one of the guideline. test the than this with note, complies sequence side GUI proceeds operation a the and the of On break it part not closes that does it then guidance window, request, user help help for the asking the initiating If of project. existence "Next" the the the creating clicking confirms is continue after script which to (accessed mark) test Project" Java question the particular "New the a Once one window of with the IDE middle button). (symbolised of the Eclipse page icon in second the Help" then the Path of project, in Build new provided case "Java a In the creating presses by enforced. it starts be Tooloperation, script Testing the can a evaluated, scenarios in was Evaluation of scenario Heuristic challenging usage variety of be Development A can Control" Sequence automate. 5.3. "Flexible guideline to the with compliance if checking test The Concept of Proof 5. Chapter h he ttso h Odr utn(dsbe"o h et eald ntemiddle, the in "enabled" left, right) the the on on ("disabled" "invisible" button and "Order" the of states three The Flexible 3.0/1 guideline testing while prompted Control IDE, Sequence Eclipse the of window help The Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. 
The respective flow diagram of the test, conveyed in figure 5.10, shows that the only non-reusable action in it, "Order Same Flight Twice in a Row", returns two errors: the first error is because the "Order" button was still enabled after the booking order, and the second error is because the application did not stop an additional booking order from being carried out after a second button click. Since the GUI does not comply with this guideline, the test fails.

Figure 5.10: UFT flow diagram for guideline 3.0/20 "Indicating Control Lockout"

Another sequence control heuristic is titled "Menu Selection by Keyed Entry". As the name implies, an automated GUI test checking this guideline would simulate key presses and check if the desired outcome was achieved. Said test was performed on the Eclipse IDE. Through a key combination, a new Java project is created, then the whole project is deleted; similarly, a new Java class is made and saved. Afterwards, the script accesses the properties of GUI elements with only key presses. Naturally, the test validates the sought behaviour of the Eclipse IDE after performing each action. Thanks to the keyboard shortcut keys the interface offers for its most repetitive features, checking the guideline "Menu Selection by Keyed Entry" can be automated with GUI event sequencing, and this test passes.

Furthermore, the guideline "Only Available Options Offered" also belongs to the sequence control functional area. A test checking compliance with it on the GUI of the Eclipse IDE can be seen in figure 5.11, where its UFT flow diagram is represented. The first non-reusable action, titled "Create Project (while checking availability)", verifies the availability of the main buttons of the "New Java Project" window in accordance with the different steps being performed. These buttons, labelled "Back", "Next", "Finish", and "Cancel", are enabled whenever appropriate, and disabled otherwise. Therefore, this portion of the test, which checks the state of the buttons, passes. Secondly, the action "Delete Project (while checking availability)" checks the navigational buttons and the state of the "Delete" button, as well as the appropriate buttons on the deletion confirmation window prompted after pressing "Delete". Similarly to the first action, this portion of the test passes as well.
Project", Empty empty "Search 5.3. an action on non-reusable last the hand, other the On Concept of Proof 5. Chapter iue5.12: Figure iue5.11: Figure h viaiiyo h Sac"bto nte"erh idwo cis Photon Eclipse of window "Search" the in button "Search" the of availability The F o iga o udln ./0"nyAalbeOtosOffered" Options Available "Only 3.2/10 guideline for diagram flow UFT Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin12/128 / 102 in moment Inspection the Usability on Heuristic-based of not Automation is message test error the the after of placement the cursor’s focus while the the on flight time, rather a but this dismissed. for "Non-Disruptive prompted is However, searching is guideline message is entry past. error which data the the the reproduced which in of being test is is the date error to selected same similar The Flight very a "Search is Messages". is test here Error action This non-reusable Error". only Date". Following guideline, The Placement Past The 5.14. "Cursor figure with called in [26]. shown is guidance is automation diagram user with flow UFT tested is Its be guidelines to Mosier’s area, and this Smith to belonging in area functional next The Guidelines Guidance User with Experimenting performing test automated option. the Menu thus, redo and "Redo" and undo guideline, the these the passes. time performing with operations Successfully this the a complies the clicking instruction. GUI As Finally, undo the by last means editor. option. again, the operations the Menu of reversed from effect "Edit" is removed the the editor. 
The test verifies the existence of a cancel operation when closing a tab in the Eclipse IDE. Since Eclipse provides users with the choice not only to save or not to save, but also to cancel closing the tab, the GUI complies with the guideline and the test passes. Checking compliance with this guideline is applicable with minimal effort, since it only requires testing the presence of a GUI element (that is to say, the element corresponding to the cancel option). Therefore, it is advised not to neglect this guideline when automating a usability inspection.

Providing users with the choice to cancel an operation might seem an intuitive guideline to follow. However, it is not rare to find programs that violate this heuristic. Netflix, for instance, is a subscription-based streaming service providing access to a huge library of films and series [213], and its desktop application can be downloaded and installed directly from the Microsoft store. As can be seen in figure 5.13, the whole sign-up and login operations of the Netflix application do not seem to be cancellable. Also, when the application is started while the operating system is not connected to the Internet, a rather strange notification message is displayed next to a "Try again" button; the message can only be seen and naturally cannot be interacted with.

Figure 5.13: Screenshot of the Netflix desktop application when started up without Internet connection

Lastly, the sequence control heuristic "UNDO to Reverse Control Actions" can also be checked on the GUI of the Eclipse IDE. To that end, the test script creates a Java class called "HelloWorld", then types a program in the editor panel. The script then proceeds to undo the last change made to the class by pressing the "Undo" item of the "Edit" menu. As a result, the effect of the last operation is reversed: the last added instruction is removed from the editor. Clicking the "Redo" option of the same menu this time reverses the undo, performing the last operation again. Since the GUI complies with the guideline by successfully performing the undo and redo operations, the automated test passes.

Experimenting with User Guidance Guidelines

The next functional area of Smith and Mosier's guidelines [26] to be tested in this study is user guidance. Its automation is tested with the guideline called "Cursor Placement Following Error". The corresponding UFT flow diagram is shown in figure 5.14, and the only non-reusable action here is "Search Flight with Past Date". This test is very similar to the test of the guideline "Non-Disruptive Error Messages", in which the same data entry error is reproduced: the user is prompted with an error message for searching for a flight whose selected date is in the past. However, the focus this time is not on the error message itself, but rather on the placement of the cursor at the moment the error message is dismissed.

In the MyFlight application, after an error message is prompted to the user, the cursor position is reset to the first graphical element of the window. Thus, the test fails because a violation of the guideline is detected: the step labelled "Cursor Placed in the Erroneous Field" expects the cursor's focus to be returned to the date picker element, but the focus is instead located in the upper left corner of the window, not on the date picker.

Figure 5.14: UFT flow diagram for guideline 4.3/13 "Cursor Placement Following Error"

Experimenting with Data Protection Guidelines

Data protection is the last functional area of user-system interaction presented by Smith and Mosier [26]. One of the heuristics belonging to this category focuses on preventing unexpected data loss when the interaction is simply interrupted. The automated test checking the compliance of the Eclipse IDE with this guideline, "Protection from Interrupts", is very similar to the test "Write a Program and Interrupt it (Include Cancel)" used for checking the rule "Cancel Option". In particular, the test creates a "HelloWorld" Java class and abruptly attempts to close the class tab in the editor; the very step that closes the tab represents the interrupting event. From this angle, the test aims at verifying that the GUI notifies the user that the unsaved added code will be lost before the tab is closed. The flow diagram of this test, called "Write a Program and Interrupt it Without Saving", is represented in figure 5.15, and the concrete GUI event sequence performed in this case is described in listing 5.3.

First, the cursor focus is put on the main "File" menu (as performed in line 1). Afterwards, in the main menu option "New", the option "Class" is selected (as done in line 2). Then the sample project is selected in the window responsible for creating the new Java class, and "HelloWorld" is entered in the name field (all encapsulated in line 3), followed by clicking the "Finish" button (as seen in line 4). Following that, the newly introduced "HelloWorld" class tab is given cursor focus (as shown in line 5), and the Java code to display the text "Hello World" on the console is entered in the editor (as done in line 6). As an interrupting action, the "HelloWorld" class tab is then closed (which corresponds to line 7). Subsequently, the test checks if a warning window appeared as a result (as can be seen in line 9) and reports back the success or failure of the test accordingly (which matches lines 10 to 15). The test passes since, by offering the user to save his code changes, the GUI of the Eclipse IDE does in fact abide by the guideline.
1   JavaWindow("Eclipse Local").JavaMenu("File").Click
2   JavaWindow("Eclipse Local").JavaMenu("New").Select "Class"
3   JavaWindow("Eclipse Local").JavaWindow("New Java Class").JavaTree("Tree").Select "Sample Project" : JavaWindow("Eclipse Local").JavaWindow("New Java Class").JavaEdit("Name:").Set "HelloWorld"
4   JavaWindow("Eclipse Local").JavaWindow("New Java Class").JavaButton("Finish").Click
5   JavaWindow("Eclipse Local").JavaEdit("StyledText").SetFocus
6   JavaWindow("Eclipse Local").JavaEdit("StyledText").Set "public class HelloWorld {" + vbCrLf + _
        "public static void main(String[] args) {" + vbCrLf + _
        "// Prints ""Hello, World"" on the Output Screen." + vbCrLf + _
        "System.out.println(""Hello, World: The Guideline titled Protection from Data Loss is being tested"");" + vbCrLf + _
        "}" + vbCrLf + "}"
7   JavaWindow("Eclipse Local").JavaTab("CTabFolder_2").Select "HelloWorld.java" : JavaWindow("Eclipse Local").JavaTab("CTabFolder_2").CloseTab "HelloWorld.java"
8   ' Closing the tab is an INTERRUPT Action
9   If JavaWindow("Eclipse Local").JavaWindow("Save Resource").Exist Then
10      Reporter.ReportEvent micPass, "Data Loss was prevented", "The user is offered to save his file before closing the tab"
11      JavaWindow("Eclipse Local").JavaWindow("Save Resource").JavaButton("Save").Click
12  Else
13      Reporter.ReportEvent micFail, "Data Loss occurred", "The Tab was closed before the user was prompted to save this data"
14      ' No save prompt appeared: the user's changes were lost
15  End If

Listing 5.3: Source code with the GUI event sequence verifying data protection from an interrupting action

The final guideline whose verification is automated is titled "User Confirmation of Destructive Actions", and belongs to the functional area of data protection. It is automated in the MyFlight application: the test script simply enters a new booking order, then deletes it. Its automation efforts are centered around detecting the presence of the notification message asking to confirm the delete operation, right after pressing the delete button. This message is shown in figure 5.16. However, while the presence of the message proves the compliance with the guideline, the buttons of the confirmation dialog are displayed in a totally different language (one other than English), which is apparently a usability problem due to the program being installed on an operating system whose language is set to German. This localisation issue might interfere with a small percentage of users, but since it is outside the scope of the heuristic under which the delete operation is tested, the test passes.

5.4 Evaluation of the Results

Before analysing the results of the study, it is important to be briefly reminded of the steps taken in it. First, a minimal set of usability heuristics was derived from a pre-existing, well-accepted list of guidelines developed by Smith and Mosier over 30 years ago. In the second step, the derived guidelines were categorised into three classes: ones suitable for automation with GUI event sequencing, ones suitable for automation without the need to simulate user interactions, and those that should be checked manually. Finally, in the last step, a sample of guidelines belonging to the first category was automated in a practical experiment. The results of each step are summarised, examined, and relied upon to comprehend whether heuristic evaluation in general, and usability inspection with GUI event sequencing in particular, could benefit from automation, or whether this inspection method should continue to be performed manually in its entirety (as currently done in practice).

5.4.1 Assessment of the Outcome of the Heuristics Filtering Process

Examining the list of design guidelines for user interfaces led to filtering 944 guidelines in the context of a Windows desktop environment. After dismissing recommendations deemed either too intuitive, redundant, obsolete, non-applicable or context-specific, the result, detailed in section 5.2, consisted of 85 guidelines. In other words, the minimal set of heuristics withheld 9% of all the guidelines presented in the original report [26].

Table 5.2 shows the results of the guideline filtering process, presenting for each functional area of user-system interaction the number of derived guidelines compared to their original number in the aforementioned report by Smith and Mosier [26]. The table also offers the retrieval rate, i.e. the percentage of selected guidelines relative to the original guideline amount in each respective area. Most functional areas have similar retrieval rates, close to 10% of the original amount. The only exceptions are the data display and data transmission areas, with guideline retrieval rates lower than 5%.
Figure 5.15: UFT flow diagram for guideline 6.0/5 "Protection from Interrupts"

Figure 5.16: Confirmation message of a booking order to be deleted in the MyFlight application

In the case of data display, the reason for the low retrieval rate is the large amount of very specific and somewhat repetitive guidelines that would rarely apply in the general context of a common Windows desktop application. These guidelines advise GUI designers about properly presenting different elements such as diagrams, charts, maps, and even auditory signals. However, the main lessons behind these recommendations are similar, favouring attributes such as consistency, clarity, and simplicity. Moreover, the data display area consists of the highest number of guidelines (making up close to one third of the entire report, 31.5% to be precise). Thus, even if the number of derived guidelines from this area were close to that of the other areas, its retrieval rate would still be lower in comparison.

Concerning data transmission, it is one of the smallest functional areas, and its guideline retrieval rate is low because its recommendations are either obsolete, outdated, or became intuitive for GUI designers. The cause is that the report describes this area with a strong orientation towards electronic mail: almost all its recommendations address the transmission of messages between users, and the report even dedicates a segment to electronic mail [26]. However, when Jakob Nielsen revisited these guidelines in 2005, he argued that many of them could be relevant for systems other than email interfaces, for instance systems where users share data or notify each other [203]. Therefore, it is safe to consider that these recommendations were relevant for the heuristic derivation process. Nevertheless, only three guidelines were selected from this particular area, as can be observed in table 5.2, because the derived heuristics should not be too entangled with one particular type of applications.

At the time of this study, the heuristics developed by Smith and Mosier are over 30 years old. Yet, if it weren't for the need to minimize the amount of selected guidelines (for the reasons explained in section 5.2), the derived set would have comprised much more than the 9% of guidelines that have been included. This is because a big portion of the guidelines are still valid and relevant for contemporary desktop applications, despite the fact that they were originally aimed at applications for old-fashioned mainframe computers. It is important to note that in the software development world, requirements change regularly, new features are constantly implemented, and with time, new technologies appear. Compared to the frequency of change in modern software and to the fast rate of evolution of technologies, heuristics covering user-system interaction can be fairly sturdy in the face of time.
Functional Area [26]      Original Guidelines [26]    Derived Guidelines    Retrieval Rate
Data entry                199                         27                    13.5%
Data display              298                         14                    4.6%
Sequence control          184                         20                    10.8%
User guidance             110                         12                    10.9%
Data transmission         83                          3                     3.6%
Data protection           70                          9                     12.8%

Table 5.2: Guidelines selection data grouped by functional interaction area
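The figures in table 5.2 can be recomputed directly from the guideline counts. The Python snippet below is only an illustrative cross-check (not part of the thesis tooling); it truncates the rates to one decimal, which matches the rates as printed in the table:

```python
# Recompute the retrieval rates of table 5.2 (derived / original guidelines).
areas = {
    "Data entry":        (199, 27),
    "Data display":      (298, 14),
    "Sequence control":  (184, 20),
    "User guidance":     (110, 12),
    "Data transmission": (83, 3),
    "Data protection":   (70, 9),
}

def rate(original, derived):
    # Percentage with one decimal, truncated, as printed in the table.
    return int(1000 * derived / original) / 10

for area, (original, derived) in areas.items():
    print(area, rate(original, derived))

total_original = sum(o for o, _ in areas.values())  # 944 guidelines examined
total_derived = sum(d for _, d in areas.values())   # 85 guidelines withheld
print(total_original, total_derived, rate(total_original, total_derived))  # 944 85 9.0
```

The column totals reproduce the headline numbers of the filtering step: 944 original guidelines, 85 derived, a 9% overall retrieval rate.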
Such heuristics, once approved by usability experts, are bound to stay valid and relevant to an application over a long period of time, and they are also less likely to be subject to change than functional requirements. Given this background, once a set of design guidelines is selected and enforced, a moderate degree of usability is reached in every single design iteration from the start, the number of iterations of a design phase would go down, and the overall development cost would be reduced; some of the cost benefits of early usability evaluation in development, discussed in section 3.3.2, might thus be reached. Other reasons why automating parts of usability inspection is appealing, and the benefits associated with such automation, are described in section 3.2.3. Furthermore, the similarities and differences between the fields of GUI test automation and heuristic evaluation automation are covered in detail in section 4.1. With this in mind, the suitability of automation for the derived set is analysed, especially when the automation re-enacts a sequence of user actions in the same manner as a typical script-based automated GUI test.

5.4.2 Analysis of Automation Feasibility in Heuristic Evaluation

The derived minimal set of usability heuristics is divided into three categories according to their compatibility with automation. The first category lists the guidelines compatible with automation with GUI event sequencing, as listed in section 5.2.2. The second category contains heuristics that are also suitable for automation, except that they don't require a sequence of GUI events (in the same manner as script-based GUI testing); these are described in section 5.2.3. Finally, the third class holds the guidelines that ought to be evaluated manually, listed in section 5.2.4. The proportions of each category, expressed in percentages, are better presented in the pie chart shown in figure 5.17. As can be observed in this chart, over half of the heuristics can be automated with event sequencing, and three quarters of this particular set can be automated in general (which can be noted by summing up the first two categories). These results mean that automated tests could assist in the automation of up to 75% of the effort needed for a complete manual heuristic evaluation (as long as a reasonable maintenance effort is invested in these tests). Moreover, it could also be deduced that over half of the derived guidelines can be examined through a GUI testing tool using event sequencing. More details about the automated evaluation feasibility of the derived guidelines can be found in table 5.3.

Figure 5.17: Overall classification results after analysis of test automation feasibility on a derived set of adequate heuristics

Functional Area [26]      Guidelines Suited for:
                          Automation with           Automation without        Manual
                          Event Sequencing          Sequencing                Evaluation
Data entry                16                        8                         3
Data display              5                         3                         6
Sequence control          14                        1                         5
User guidance             6                         2                         4
Data transmission         2                         1                         0
Data protection           4                         2                         3

Table 5.3: Feasibility of heuristic evaluation automation grouped by functional area

It can be noted that the two functional areas most suited for automated evaluation with GUI event sequencing are data entry and sequence control. The reason for this is the high amount of guidelines in these areas that directly call for user-system interaction. In other words, guidelines in these areas cannot be evaluated by only launching the evaluated application and looking at the GUI; as a result, their automated evaluation has to reproduce the same interactions by interacting with the GUI in order to check for particular guideline violations. Another aspect that can be observed is that automation is not always superior to manual inspection in all functional areas. The data display functional area distinguishes itself from the others in its high compatibility with manual testing, despite holding the highest overall number of guidelines. The reason behind this result is that data display guidelines assess properties that cannot be captured by a GUI automation tool. For instance, automated GUI testing cannot measure the importance of one graphical element compared to others, or the correctness of its logical placement.
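The percentages quoted above follow from the column totals of table 5.3. Again, the short Python computation below is only an illustrative cross-check, not part of the thesis tooling:

```python
# Per-area guideline counts from table 5.3, ordered: data entry, data display,
# sequence control, user guidance, data transmission, data protection.
with_sequencing    = [16, 5, 14, 6, 2, 4]
without_sequencing = [8, 3, 1, 2, 1, 2]
manual             = [3, 6, 5, 4, 0, 3]

total = sum(with_sequencing) + sum(without_sequencing) + sum(manual)

def share(count):
    """Percentage of the derived set, rounded to the nearest integer."""
    return round(100 * count / total)

print(total)                                                  # 85
print(share(sum(with_sequencing)))                            # 55
print(share(sum(with_sequencing) + sum(without_sequencing)))  # 75
print(share(sum(manual)))                                     # 25
```

The three shares (55% with event sequencing, 75% automatable in general, 25% manual) are exactly the proportions visualised in the pie chart of figure 5.17.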
automation a but deduction GUI of value GUI, a cognitive aesthetic by the human the captured on judge a be observable through cannot directly properties assessed not them desired is of be the that all can property among that a trait one capturing common rather require a all testing, They manual for or observed. suited messages, is guidelines error the of of length and examination values. the accessed Upon default checking be predefined as can the such checked with GUI, be working the to of on attribute side-effects displayed the the be even because can is system guideline it do the a before properties with of verified specific compliance end their testing back of words, cases, the capture some other in the In In performed and interaction. state, check user-system sequencing. one can on event through test depend requiring go not automated whose only without an test properties aligned example, under Results GUI visually For elements the are observable GUI of action. Evaluation elements checking user GUI involve external 5.4. some an all if on automation They depend for not suitable do sequence. guidelines assessment all events between GUI traits after a common and without some before also elements are with there GUI Subsequently, compliance displayed of the interaction. verification user-system of of a the properties evaluation simulating) if (or the more the sequencing, performing through checking event Therefore, pass requires GUI can with guideline steps. automated GUI that multiple the be a comprises and can Additionally, evaluated begins, heuristic GUI. being interaction a the the interaction with before the observed interacts if is user states goes state the it GUI while after first GUI starts The the examine second states: would two guidelines least these of at one through each evaluating test the words, other Concept of Proof 5. Chapter • ih lowr oehrwt sblt xet oefiinl uoaehuitcevaluations. 
heuristic to- automate efficiently work to can testers experts developers software usability interfaces, with GUI their together like of work Just quality also the might improve automation. to of professionals out could usability and with most experts time, gether the usability of make and period to testers long together software a work Therefore, over behaviour. than users user scenarios observed usage studied realistic who extensively selecting at professional better usability flexibility, be of could experienced level anyone an satisfying that a imagine challenging to ensure testing be hard scenarios is scenario which might it confidence It to but with 3.1.3). choose similar to section checking process tester in a a instance, defined for technique For become on specification-based can impact a consider. is stronger flexibility to (which with GUI scenarios aspects with hypothetical GUI dealing which prioritize guidelines to specify order to professional initial in usability or the a testers usability, extension, Therefore, with of by extensive. amount together and be the high, work also quite GUI, can can be automation entire the can an for test across required to guideline effort elements a GUI with or compliance operations check possible to cases, some In Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin10/128 / 110 Inspection Usability Heuristic-based of Automation Results the of Evaluation 5.4. Concept of Proof 5. Chapter • • • te.Tedcso oefreacrangieieoe nte hudb oigfo the from coming be designers. should of perspective another the qual- over from desirable guideline not two certain and between a users needed of guideline enforce perspective is one to trade-off Choosing decision a The logic. mean order. grouping ities. 
can alphabetical one ones or only conflicting use, with if many of compliance check among frequency check that importance, to evaluation guidelines sufficient function, their distinct is its if are even It by there time, instance, either same For grouped the GUI. is same at data the GUI on a automated on be applied can be sample. not user a should requires guidelines always Some process usability discussed appraisal was the individual interface summary, the an in GUI cannot of and of the attribute evaluation 2.2.3, severity usability heuristic if section the the in automated Measuring check an assess usability. is measure sense, not to do general used do can the be They in tests Even the guidelines. uncovered. All of violations set automated. a fully by is abides evaluation heuristic if about Even knowledge prior requiring messages without can error tests, that is multiple guidelines GUI if user of event-driven usability. sort checking a into the or converted are if exit, message), easily checking one to formulated be just instance, trying are of For is (instead heuristics consecutively he these prompted interpretation. when are because data for is unsaved room This of no notified expert cases. almost usability test leave a the for and review need clearly, the or without testers, automated guide be can to heuristics many of evaluation The Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin11/128 / 111 Inspection consisting Usability set event heuristic Heuristic-based a the of with and Automation conform tool and testing select the GUI to that team a a guidelines on for selected relying possible the by is it of Therefore, technique, nature by. 
Since not all heuristics can be applied at once on the same application, the percentage of heuristics that should be checked manually may differ depending on the software development project and on the guidelines the team decides to abide by. Therefore, it is possible for a team to select a heuristic set consisting of guidelines whose evaluation is suitable for automation, and to conform to it by relying on a GUI testing tool and the event sequencing technique.

6 Conclusion

Heuristic evaluation is a cheap and simple usability inspection method that is generally being performed manually in its entirety, even though the evaluator does not need to observe a user sample. It has been theorized in this thesis that GUI event sequencing, a technique normally used in script-based GUI testing, would have a positive effect on the automation of heuristic-based usability inspection. Therefore, it was needed to analyse the compatibility of automation with heuristic evaluation in general, and with GUI event sequencing in particular.

First, the literature relating to GUI fundamentals and vocabulary was considered. A helpful taxonomy was presented to assess the degree of compatibility of usability evaluation methods with automation, including a discussion of the current progress and success of said automation efforts in usability engineering. It was deduced that among all presented methods, the methods of usability inspection have the greatest automation potential, as they do not require user presence, and an in-depth comparison of available GUI testing tools assisting in evaluation automation was provided alongside. Afterwards, a reputable source of usability heuristics was selected after discussing the durability of such guidelines and their fit to a specific context of use. Finally, the automation feasibility of each guideline of the derived set, which is tailored to modern Windows desktop applications, was discussed on a case by case basis. Analysing the derived heuristics led to classifying 55% of the guidelines as compatible with automation with GUI event sequencing, while 75% of them were in general suitable for automation. Only 25% of the provided guidelines were judged more suited for manual evaluation.

The validity of the classification was then challenged in practice through an implementation. The experiment involved checking the compliance of two different software applications with fourteen guidelines judged appropriate for automation with GUI event sequencing. Each test, whose objective is normally to identify usability issues instead of software defects, was successfully converted into a script-based GUI test and performed on an event-driven GUI testing tool. Through the performed work, a common pattern between all heuristics belonging to each category was uncovered. In other words, a guideline can be verified automatically with event sequencing as long as its manual evaluation requires the evaluator to interact with the appropriate graphical elements, or to trigger at least one GUI event, such as when provoking error messages or testing the risk of data loss. On the other hand, the evaluation of a guideline without event sequencing involves assessing easily observable attributes, such as the visual alignment of graphical elements, the advised text lengths, or the default values offered; since at least one aspect of the recommendation is stored in the GUI itself, it can be verified even in some sort of unit tests. Finally, manual evaluation is advised when the attributes to check cannot be captured by a test automation tool and their assessment revolves around a human cognitive process: the expertise of a GUI specialist is sometimes required, for instance when judging the aesthetic sense of a GUI, or when examining the logic by which data items and menu options are ordered and grouped.

This kind of automated evaluation does not aim to identify all existing usability issues, but rather to cheaply sustain an appropriate degree of usability during an iterative design process. As a result, the inspection of guidelines whose evaluation can be automated can be performed easily and repeatedly, and the automated tests can act as regression tests after committing some design changes. However, heuristic evaluation is not 100% compatible with automation, and the role of the usability evaluator cannot be entirely replaced, even when the goal of quickly achieving a reasonable degree of usability has been reached: selecting relevant usage scenarios and action sequences requires the expertise of an expert user. His presence is also required for validating the overall results and testing efforts, such as overseeing the automated evaluation process, deducing the severity of identified issues, or guiding a beginner performing a testing task.

Deeper analysis can be done in future extensions of this work, for instance in case studies where a development team has its testers collaborate with usability experts to automate a considerable portion of a heuristic evaluation, and where the impact of the automated evaluation can be examined and observed. The result of such studies would confirm and assess the quality of the findings of this thesis, and would also enable the researcher to observe the designed automation assisting the evaluator in practice. Eventually, it might be possible to measure how effective the automation is in relation to its cost. Furthermore, an additional potential for automation lies in machine learning: relying on machine learning techniques might make it possible to automate heuristics that can currently only be evaluated manually, such as judging the aesthetic value of the interface or the correctness of the logical grouping of the menu options. All those efforts would further close the gap between automated and manual heuristic evaluation.
to expert sequences re- needed an action be is as relevant cannot selected evaluator and evaluator of scenarios the usability set usage the of and reasonable the of expertise achieving role when The quickly the even of automation, However, goal placed. with the usability. compatible has 100% of rather is degree does but guidelines evaluation appropriate issues, automated an usability of sustaining existing kind all cheaply This identify alongside to changes. or process aim design usability design some not heuristic-based iterative committing result, an after during a repeatedly tests As and regression easily automated. performed be be can can evaluation inspection whose guidelines of entirely Conclusion 6. Chapter Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin13/128 / 113 Inspection Usability Heuristic-based of Automation References Bibliography Bibliography 1]A erka ta.„lxberprigfratmtduaiiyadacsiiiyevaluation accessibility and usability automated for reporting „Flexible al. et Beirekdar A. [16] „KWARESMI–Knowledge- Noirhomme-Fraiture. M. and Vanderdonckt, J. Beirekdar, A. [15] 1]E .Cie l Tebodon rjc:atmtn icvr fwbuaiiyissues usability web of discovery automating project: bloodhound „The al. et Chi H. E. [18] Dingli. A. and Mifsud J. [17] in Assistance the for Tool A „WebA: Lapena. L. E. and Andrés, L. M. P. Tobar, M. L. [13] In: devices“. mobile touchscreen-based for heuristics „Usability al. et Inostroza R. [12] Cardello. J. [10] 1]J ile.„nacn h xlntr oe fUaiiyHuitc“ In: Heuristics“. Usability of Power Explanatory the „Enhancing Nielsen. J. [11] 1]M .J aht.„raiedsg fitrciepout n s fuaiiyguidelines-a usability of use and products interactive of design „Creative Machate. J. B. M. [14] 2 .M eo,M .Plak n .L of.„sn oldie praht generate to approach goal-driven a „Using Soffa. L. M. and Pollack, E. 
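To make the guideline-to-test conversion concrete, the sketch below shows how one such heuristic (invalid input must trigger a descriptive error message) can be expressed as an event-driven check. It is a minimal illustration only: the `FakeDialog` stub and the names used here are assumptions standing in for a real GUI-automation driver, not the tooling used in this work.

```python
# Minimal sketch of converting a usability guideline into an
# event-driven GUI test. FakeDialog is a hypothetical stand-in for a
# real GUI-automation driver: it models a dialog with an "age" field
# that reacts to invalid input.

class FakeDialog:
    def __init__(self):
        self.error_message = None   # observable GUI state
        self.focused = None

    # Events that an automated evaluator script can replay:
    def focus(self, control):
        self.focused = control

    def type_text(self, control, text):
        self.focus(control)
        if control == "age" and not text.isdigit():
            self.error_message = "Age must be a number."

def check_error_feedback(dialog):
    """Guideline as a test: invalid input must trigger a non-empty
    error message. The check replays an event sequence and then
    inspects observable GUI state, exactly like a script-based GUI
    test, but its oracle is a usability heuristic rather than a
    functional specification."""
    dialog.type_text("age", "abc")   # event sequence: invalid entry
    return bool(dialog.error_message)

if __name__ == "__main__":
    print(check_error_feedback(FakeDialog()))   # → True
```

Against a real application, the stub would be replaced by a driver that sends the same focus/type events to actual controls and reads back the rendered error label.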
[1] [6] [4] [5] Bioinformatics nri tdo h fca D o Android for IDE Official The Studio, Android ranorex.com. h eso h eflxcn.In: cdn“. netflix the of lens the android.com/studio/index.html. Sequencing Event GUI with Evaluation Heuristic 28–34. pp. (2018), ou,2018. Focus, eeimQ rwe Automation Browser SeleniumHQ, //github.com/Amir-DA/automated-usability-inspection}. aoe,Ts uoainfrGITesting GUI for Automation Test Ranorex, org. irsFcs 2018. Focus, Micros aneac n oicto fteelpeIE.I:(2015). In: IDE“. eclipse the of modification and I maintenance part conference, computer joint fall 1968, 9-11, 267–277. pp. December the of ings nru.o/rilsdrblt-fuaiiygieie/ cesd 2019-05-28. Accessed: nngroup.com/articles/durability-of-usability-guidelines/. nru.o/rilssxygieie-rm18-eiie/ cesd 2019-05-28. Accessed: nngroup.com/articles/sixty-guidelines-from-1986-revisited/. Technologies 1164–1173. pp. (1989), 32.10 engineering electronics and information of journal national niebhvo“ In: behavior“. online /w.nru.o/rilsie-itve-l/ cesd 2019-05-28. Accessed: //www.nngroup.com/articles/item-list-view-all/. ofrne nAvne nCmue-ua neatos(CI21) IARIA 2011), (ACHI Interactions 59–62. Computer-Human pp. in Advances on Conferences ile omnGop sr’Pgnto rfrne n Ve All" "View and Preferences Pagination Users’ Group, Norman Nielsen ile omnGop it udlnsFo 96Revisited 1986 From Guidelines Sixty Group, Norman Nielsen ile omnGop uaiiyo sblt Guidelines Usability of Durability Group, Norman Nielsen tp/wwntakthr.o/ cesd 2019-05-28. Accessed: http://www.netmarketshare.com/. . eko prtn ytmMre hr,Mre hr ttsisfrInternet for Statistics Share Market Share, Market System Operating Desktop iHb opeeSuc oefraPatclEprmn nAutomated on Experiment Practical a for Code Source Complete GitHub, . 21) p 92–96. pp. 
(2016), 8.4 nfidFntoa etn uoil otaeVrin14.03 Version Software Tutorial, Testing Functional Unified nfidFntoa etn d-n ud,Sfwr eso 14.03 Version Software Guide, Add-ins Testing Functional Unified neatn ihcomputers with Interacting C ICM optrCmuiainReview Communication Computer SIGCOMM ACM cesd 2019-05-27. Accessed: . nnt Scrolling Infinite cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . 156(09,p.393–402. pp. (2009), 21.5-6 cesd 2019-05-28. Accessed: . SPtn p.1/3,0.2010. 12/833,901. App. Patent US . URL . 21) .33. p. (2011), 1.1 omnctoso h ACM the of Communications https://www.seleniumhq. : rc t International 4th Proc. URL URL tp www. / / : https . https://developer. : https://www. : URL imtisand Biometrics https://www. . C.1968, ACM. . \url{https: : Proceed- Micros . 2011, . https: . Inter- 48.1 . Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. [147] [148] [142] [141] [149] [122] uoaino ersi-ae sblt npcin12/128 / 122 Inspection Usability Heuristic-based of Automation [143] [109] Bibliography [150] [146] [135] [144] 12 .Rauterberg. M. [112] [151] [145] [152] [137] [136] [121] [153] [140] [138] [139] aaot,WbU Testing UI Web , Parasoft de/products/silk-portfolio/silk-test/. qihGITse,AtmtdGITsigTool Testing GUI Automated Tester, GUI Squish capability/web-ui-testing/. avrx eti simple it Test Marveryx, eei etSui otaeTsigTools Testing Software , Studio Test Telerik www.froglogic.com/squish/. rpzn,CnhaSy Portal Says Cynthia Cryptzone, com/. FTs sAie U etn olfrJv n h Web the and Java for Tool Testing GUI Agile, is QF-Test //www.oracle.com/technetwork/oem/app-test/index.html. etCmlt uoae otaeTsigMd Simple Made Testing Software Automated , Complete Test telerik.com/teststudio. ir ou,Sl Test Silk Focus, Micro //sahipro.com/. AscentialTest? is What Zeenyx, com/. 
B,Rtoa ucinlTester Functional Rational IBM, https://www.qfs.de/en.html. rcni,TsaAtmt UI Automate Tosca Tricentis, https://smartbear.com/product/testcomplete/overview/. ai uoainTsigTo o e Applications Web for Tool Testing Automation Sahi, en/marketplace/rational-functional-tester. nfidFntoa etn uoae etn Software Testing Automated , Testing Functional Unified resource-assets/tosca-automate-ui/. Toolkit Dojo , DOJO autoit. AutoIt Script, It Auto AscentialTest.html. av,VrySOto n piiainguide optimization and tool SEO Varvy Varvy, idemployee.id.tue.nl/g.w.m.rauterberg/amme.html. iulSui etPoesoa,Sfwr etn o professional for testing Software Professional, Test Studio Visual testing/ automated- functional- us/products/unified- overview. https://www.microfocus.com/en- sreig sblt etn rmUserTesting from testing Usability UserTesing, usertesting.com. Mco e rwe citn aaEtato n e etn nisic project ipswitch An Testing Web and Extraction 2019-05-27. Data Accessed: Scripting Browser Web , iMacros io/. URL gPat,Dlvrn reTs Automation Test True Delivering , EggPlant rceEters aae,ApiainTsigSuite Testing Application Manager, Entreprise Oracle reeko.r,lcdp FreeDesktop.org, https://visualstudio.microsoft.com/de/vs/test-professional/. : ME oepg fAMME of page Home AMME, cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . URL cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . https://imacros.net/. : cesd 2019-05-27. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-27. Accessed: cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . URL cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . URL URL URL URL cesd 2019-05-27. Accessed: . https://dojotoolkit.org/. : https://www.autoitscript.com/site/ : https://ldtp.freedesktop.org/wiki/. : https://www.microfocus.com/de- : cesd 2019-05-27. Accessed: . URL https://www.maveryx.com/. : cesd 2019-05-27. Accessed: . 
URL URL tp w.prsf o / com . parasoft www. / / : https : URL URL cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . cesd 2019-05-27. Accessed: . https://www.tricentis.com/ : https://www.zeenyx.com/ : https://www.ibm.com/us- : http://www.cynthiasays. : cesd 2019-05-27. Accessed: . URL URL URL URL URL https://eggplant. : https://www. : https://varvy. : https://www. : URL http://www. : URL URL https:// : https: : https: : URL URL URL : : : . Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin13/128 / 123 Inspection Usability Heuristic-based of Automation [154] Bibliography [155] [159] [160] [161] [162] [164] [170] [165] [166] [168] [171] [169] [174] [173] [172] [167] [175] [176] [177] [178] [182] ai rjc,WbApiainTsigi Ruby in Testing Application Web Project, Watir N ne eodadRpa o iu n te 1 ae Systems Based X11 Other and Linux 05-27. for Replay and Record Xnee, GNU com/. nfidFntoa etn etrs uoae etn Software Testing Automated Features, Testing Functional Unified //www.all4tec.com/matelo. pltos uoae iulTesting Visual Automated Applitools, iui,AtmtdAyhn o See You Anything Automated SikuliX, com/. okak ipeadPwru srResearch User Powerful and Simple LookBack, com/helio/. Analytics App Mobile Understand Appsee, ehmt,UaiiyTsigwt Morae with Testing Usability TechSmith, com/. osSas,U nlssSuite Analysis UX , MouseStats techsmith.com/morae.html. Tool Testing User Online 11, Loop lookback.io/. Desingers UX for Testing Impression First Attensee, appsee.com/. rwSga nieSre Software Survey Online , CrowdSignal //www.attensee.com/. rMU est sblt Testing Usability Website , TryMyUI Easy Made Mac the on Testing Usability Guerrilla , Silverback Testing and Prototyping through 28. Navigation easier Create , Naview com/. ei ail eeln e srBehaviors User Key Revealing Rapidly , Helio com/. 
sbla,TeSadr srFeedback User Standard The , Usabilla com/. htsrD sblt n srTesting User and Usability , WhatUsersDo com/. opa elTm utmrAnalytics Customer Time Real Woopra, whatusersdo.com/. com/. Terx est pe n efrac Optimization Performance and Speed Website GTmetrix, //www.techsmith.com/morae.html. https://gtmetrix.com/. URL URL aa sblt etn ihMreb TechSmith by Marae with Testing Usability , Marae aeoitgain aeoPui yAll4Tech by Plugin MaTeLo , integrations MaTeLo URL https://software.microfocus.com/en-us/software/uft/features. : https://silverbackapp.com/. : URL https://www.naviewapp.com/. : https://xnee.wordpress.com/. : cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-27. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . URL cesd 2019-05-27. Accessed: . URL URL https://www.mousestats. : URL URL URL URL cesd 2019-05-28. Accessed: . cesd 2019-05-28. Accessed: . https://www.loop11. : https://www.trymyui. : http://sikulix.com/. : URL https://www.woopra. : https://crowdsignal. : cesd 2019-05- Accessed: . tp pltos. applitools / / : https : URL URL URL https://usabilla. : cesd 2019- Accessed: . URL URL tp www. / / : https : https://www. : https://www. : URL https://zurb. : http://watir. : URL URL URL tp / / : https : https: : http: : http: : URL : Die approbierte gedruckte Originalversion dieser Diplomarbeit ist an der TU Wien Bibliothek verfügbar. The approved original version of this thesis is available in print at TU Wien Bibliothek. uoaino ersi-ae sblt npcin14/128 / 124 Inspection Usability Heuristic-based of Automation [206] Bibliography 29 License. 
Appendix A: Flow Diagrams of Additional Automated UFT Heuristic Evaluation Instances

When it comes to the practical UFT test instances for evaluating heuristics, some additional guideline flow diagrams (whose flow diagrams were not described in section 5.3.2) are included in this appendix. The flow diagrams are presented in the same order as the heuristics they belong to in section 5.3.2. The figures in this appendix can be viewed below:

Figure A.1: UFT flow diagram for guideline 1.4/15 "Explicit Tabbing to Data Fields"

Figure A.2: UFT flow diagram for guideline 1.7/3 "Non-Disruptive Error Messages"

Figure A.3: UFT flow diagram for guideline 3.0/1 "Flexible Sequence Control"
Figure A.4: UFT flow diagram for guideline 3.1.3/7 "Menu Selection by Keyed Entry"

Figure A.5: UFT flow diagram for guideline 3.3/3 "Cancel Option"

Figure A.6: UFT flow diagram for guideline 3.5/10 "UNDO to Reverse Control Actions"

Figure A.7: UFT flow diagram for guideline 6.0/18 "User Confirmation of Destructive Actions"
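To give a rough idea of the kind of check such a flow diagram encodes, the following sketch evaluates a recorded GUI event sequence against guideline 6.0/18 "User Confirmation of Destructive Actions". This is a minimal illustration in Python, not the thesis's UFT/VBScript implementation; the event names, the sequence format, and the set of destructive actions are illustrative assumptions.

```python
# Hypothetical sketch: checking a recorded GUI event sequence against
# guideline 6.0/18 "User Confirmation of Destructive Actions".
# Event names and the (action, target) tuple format are assumptions,
# not the thesis's actual UFT recording format.

DESTRUCTIVE = {"delete", "clear_all", "overwrite"}

def violates_confirmation_guideline(events):
    """Return the destructive events that were executed without a
    confirmation dialog appearing immediately afterwards."""
    violations = []
    for i, (action, target) in enumerate(events):
        if action in DESTRUCTIVE:
            follow = events[i + 1] if i + 1 < len(events) else None
            if follow is None or follow[0] != "confirm_dialog":
                violations.append((action, target))
    return violations

# Example sequence: the first delete is confirmed, the clear_all is not.
recorded = [
    ("click", "file_menu"),
    ("delete", "report.txt"),
    ("confirm_dialog", "Delete report.txt?"),
    ("clear_all", "workspace"),   # no confirmation follows -> violation
    ("click", "exit"),
]
print(violates_confirmation_guideline(recorded))
# -> [('clear_all', 'workspace')]
```

A real UFT instance would obtain the event sequence from the recorded GUI session and report violations through the test framework, but the pass/fail logic follows the same shape as the flow diagrams above: find a destructive action, then branch on whether a confirmation dialog was observed.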