Research Design

Total pages: 16 · File type: PDF · Size: 1020 KB

Research Design
Thomas Plümper, Department of Government, University of Essex
[email protected]
© Thomas Plümper 2007/08

Organization
Lesson 1: Why Research Designs?
Lesson 2: What Is a Good Research Question?
Lesson 3: Basic Concepts, Discussions, and Axioms in the Philosophy of Science
Lesson 4: Causality
Lesson 5: Theory Formation
Lesson 6: Theory and Empirical Analysis
Lesson 7: Qualitative Research Designs and Case Studies
Lesson 8: Quantitative Research Designs
Lesson 9: Robustness
Lesson 10: Efficient Writing: A Summary of Issues
Lesson 11:

Lesson 1: Why Research Design?

What is Science?
"Science is a public process. It uses systems of concepts called theories to help interpret and unify observation statements called data; in turn the data are used to check or 'test' the theories. Theory creation may be inductive, but demonstration and testing are deductive, although, in inexact subjects, testing will involve statistical inference. Theories that are at once simple, general and coherent are valued as they aid productive and precise scientific practice." — David F. Hendry (1980)

Why Research Design?
Axiom 1: A research design is good if and only if it allows researchers to draw valid inferences.

(By the way: an axiom is a sentence or proposition that is not proved or demonstrated and is considered self-evident, or an initial necessary consensus for theory building or acceptance.)

Nevertheless, Axiom 1 leads to two questions:
1. Why should scientists be interested in valid inferences?
2. Can we prove that inferences are valid?

Why (social) scientists should be interested in valid inferences (1)
One step back: what are scientists interested in?
− maximizing lifetime utility (a function of income, social status, (1 − uncertainty), and so on)
− getting tenure
− getting cited
− publications in a certain type of journal (or a book with a very good publisher)

or, perhaps more seriously but certainly on a lower plane:
− explanations
− generalizations
− simplifications

or, in short:
− advancing scientific knowledge (we will repeatedly come back to this statement)

Why (social) scientists should be interested in valid inferences (2)
Axiom 2: Eventually, (social) scientists are interested in theories that are simultaneously as simple and as general as possible.

It follows:
1. A simpler theory is better than a more complicated theory that does not explain more.
2. An equally simple theory that explains more is better than a theory that explains less.
3. A more complicated theory that explains more is not per se better than a less complicated theory that explains less.

'More' means more cases, more phenomena, and so on.

Why (social) scientists should be interested in valid inferences (3)
We can see how complicated a theory is when we see one (or compare it to other theories). We cannot see how valid the generalizations are that the theory makes.

Thus: (social) scientists need to develop theories and test them, that is, test the generalizations the theory makes. But keep in mind that theories need to simplify. Testing a theory therefore means testing whether the predictions of the theory are correct, not whether its assumptions are 'true'.

Hence Axiom 3: Valid inferences are a necessary condition for an appropriate test of a theory.

And back to research design, Axiom 1 reformulated: A good research design is one that allows making valid inferences, and is thus a necessary condition for an appropriate test of a theory.
Lesson 2: What Is a Good Research Question?

Axiom 4: A good research question is one that leads to a theory with an ex ante probability of being correct close to 0.5.

'Correct' here means: the theory simplifies reality in a way that leads to generalizations which help us understand many real-world phenomena.

Why approximately 0.5?
Given that research should increase (or foster) the visibility of the researcher:
− A theory with a prior probability of finding empirical support close to 1.0 is trivial.
− A theory with a prior probability of finding empirical support close to 0.0 is risky.

Again, keep in mind that researchers test the predictions of a theory, not its assumptions.

Do we know prior probabilities? Of course: just ask your colleagues whether they think your hypotheses are correct.

Falsification and Falsifiability

Karl Popper (1963): Theories must be falsifiable. Thus the words 'may', 'could', 'should', and so on shall not be used in theories. If some action or effect is conditional, make the conditionality explicit; if it takes place only with a certain probability, make this clear.

Imre Lakatos (1973): "The demarcation between science and pseudoscience is not merely a problem of armchair philosophy, it is of vital social and political relevance."

David Hume (1748): "Let us ask: does it [any volume] contain any abstract reasoning concerning quantity or numbers? (…) No. Commit it then to the flames."

Falsifiability

Popper uses the term falsifiability with two different meanings:
1. Falsifiability as a logical property of statements: scientific statements must logically imply at least one testable prediction.
2. Falsifiability as a normative construct, telling scientists that a test of a theory should try to refute it.

There is no relevant dissent about the first meaning, but the prescriptive meaning has led to huge controversies.
I use the term in the first sense, and will explain why 'naïve falsification' does not lead to scientific progress.

On Naïve Falsification

Thomas Kuhn and Imre Lakatos: abandoning a theory the instant it makes false predictions would eliminate too much good research.

Well, yes, but the main point is that the huge majority of (social science) theories are not deterministic but probabilistic. This implies that we cannot falsify a theory in Popper's sense. Rather, we have to show that on average the theory's predictions are wrong.

Thomas Kuhn (1962): actual scientists do not refute a theory simply because it makes false predictions (or, even worse, one false prediction). But let's not talk about paradigmatic change and scientific revolutions here…

'Proving' Theories Right?

Hume, Popper, Lakatos, and in fact about everyone else agree that scientists cannot prove a theory to be right. This made Hume and Popper stress that scientists need to try to prove a theory wrong.

Lakatos, however, claims that science is not a competition between theories. For Lakatos, research programs compete, and research programs can be either progressive or degenerating: progressive research programs continue to predict novel facts; degenerating research programs fabricate theories in order to accommodate known facts.

But let's repeat and keep in mind: verification of truth is logically impossible.

Can Probabilistic Theories Be Tested?

Of course, but scientists need to agree socially on a threshold that tells us when the empirical evidence contradicts the probabilistic predictions derived from a theory 'too much'.
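The 'social agreement' invoked here is, in statistical practice, a significance threshold. A minimal sketch of such a test, assuming a hypothetical theory that predicts an event with probability 0.7 and an agreed threshold of alpha = 0.05 (both numbers invented for illustration, not from the slides):

```python
# Illustrative sketch: testing a probabilistic theory's prediction against
# a socially agreed significance threshold. All numbers are assumptions.
from math import comb

def binom_two_sided_p(k, n, p):
    """Two-sided binomial p-value: total probability of all outcomes
    no more likely than the observed count k under prediction p."""
    pk = comb(n, k) * p**k * (1 - p)**(n - k)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n + 1)
               if comb(n, i) * p**i * (1 - p)**(n - i) <= pk + 1e-12)

# The (hypothetical) theory predicts the event with probability 0.7;
# we observe it in only 55 of 100 cases.
p_value = binom_two_sided_p(55, 100, 0.7)

# The 'social agreement' is the threshold, here alpha = 0.05: evidence
# contradicts the prediction 'too much' when p_value falls below it.
alpha = 0.05
print(f"p-value = {p_value:.4f}; reject prediction: {p_value < alpha}")
```

The point of the sketch is that the rejection decision depends on a conventional threshold, not on a single contradicting observation, which is exactly why probabilistic theories cannot be falsified in Popper's strict sense.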
Probabilistic Theories and Verification

"Verification is logically impossible." (page 13)

Yet when studying the empirical relevance of probabilistic theories we get a more complex result (the statement nevertheless remains correct, as we will see). Let's turn to the Bayesian philosophy of science.

Bayesians contend that confirmation is quantitative: evidence E confirms a hypothesis H if and only if it raises the probability of H:

p(H|E) > p(H)

Read: the probability of H (being correct) given empirical evidence E is larger than the probability of H prior to the presentation of empirical evidence E. We refer to these as the prior and the posterior probability.

Bayes's Theorem

Axioms:
1. Every probability is a real number between 0 and 1: 0 ≤ p(A) ≤ 1.
2. If A is necessarily true, then p(A) = 1. (Note: we cannot prove p(A) = 1, but we can believe it.)
3. If A and B are mutually exclusive (it is not possible that statements A and B are both true), then p(A ∨ B) = p(A) + p(B).
4. p(¬A) = 1 − p(A).
5. If A logically entails B, then p(B) ≥ p(A).
6. p(A ∨ B) = p(A) + p(B) − p(A & B); if A and B are mutually exclusive, this reduces to axiom 3.

Bayes's theorem in its simplest form:

p(B|A) = p(A|B) · p(B) / p(A)

Bayes and Advancement in Scientific Knowledge

A more complex form of Bayes's theorem:

p(T|E) = p(E|T) · p(T) / [p(E|T) · p(T) + p(E|~T) · p(~T)]

Read: the posterior probability that a theory T is correct given some evidence E is a function of
− the prior probability that T is correct,
− the likelihood of the evidence under the theory, p(E|T),
− the overall expectedness of the evidence E (the denominator).

If a theory is deterministic, then p(E|T) = 1. If a theory is probabilistic, then p(E|T) < 1.

Problems

p(E), p(T), and p(~T) are difficult to determine. p(E) might be biased upward due to old evidence. p(~T) would require rival theories, which may or may not exist.
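The complex form of Bayes's theorem can be put to work numerically. A minimal sketch; all probabilities below are invented for illustration, and `posterior` is a hypothetical helper, not from the slides:

```python
# Illustrative sketch of Bayes's theorem in the form
#   p(T|E) = p(E|T)p(T) / (p(E|T)p(T) + p(E|~T)p(~T)).
# All probability values are assumed numbers for the example.

def posterior(p_T, p_E_given_T, p_E_given_notT):
    """Posterior probability of theory T after observing evidence E."""
    numerator = p_E_given_T * p_T
    return numerator / (numerator + p_E_given_notT * (1 - p_T))

p_T = 0.5             # prior, cf. Axiom 4's 'close to 0.5'
p_E_given_T = 0.8     # probabilistic theory, so p(E|T) < 1
p_E_given_notT = 0.3  # how expected E is if the theory is false

p_T_given_E = posterior(p_T, p_E_given_T, p_E_given_notT)
print(p_T_given_E)  # larger than the prior: E confirms T
```

Note how the result turns on p(E|~T): the less expected the evidence is without the theory, the more the same observation raises p(T).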
p(T) may vary across researchers (early Bayesians suggested deriving p(T) from observed behavior).

Choosing Research Questions According to Bayes

Scientists can 'confirm' theories in a Bayesian world. Confirmation, however, just means that new evidence increases the (inter-)subjective probability of T.
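The two claims running through these slides — confirmation is possible, verification is not — can be illustrated by updating repeatedly on confirming evidence: the posterior climbs toward 1 but, for a probabilistic theory, never reaches it. A sketch with assumed likelihoods (0.8 and 0.3, invented for illustration):

```python
# Illustrative sketch: sequential Bayesian updating on repeated confirming
# evidence. As long as the theory is probabilistic (p(E|T) < 1, p(E|~T) > 0),
# each round of evidence confirms T without ever verifying it.

def update(p_T, p_E_given_T=0.8, p_E_given_notT=0.3):
    """One Bayesian update of p(T) after observing confirming evidence E."""
    num = p_E_given_T * p_T
    return num / (num + p_E_given_notT * (1 - p_T))

p = 0.5  # prior
for _ in range(10):
    p = update(p)  # posterior rises each round, yet stays strictly below 1

print(p)
```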