First Steps in Relative Quantification Analysis of Multi-Plate Gene Expression Experiments


First Steps in Relative Quantification Analysis of Multi-Plate Gene Expression Experiments
RealTime ready Application Note, June 2011
Heiko Walch and Irene Labaere, Roche Applied Science, Penzberg, Germany

Abstract/Introduction

This technical application note describes the first steps in data analysis for relative quantification of gene expression experiments performed using RealTime ready Custom Panels. The easy initial setup as well as the "Result Export" procedure of the LightCycler® 480 Software is described. Further preparation of the data depends on the means of analysis. Three routes to analyze the data are described:

1. Spreadsheet calculation software
2. GenEx Software from MultiD Analysis
3. qbasePLUS 2.0 from Biogazelle

Whereas GenEx and qbasePLUS are software packages from third-party vendors specialized in RT-qPCR data analysis (1, 2), the spreadsheet calculation approach can be seen as a generic procedure for performing subsequent analysis either in the spreadsheet software or in other sophisticated statistical tools such as R, SigmaPlot, etc. (3, 4). Note that this article does not focus on qPCR setup, experimental planning, or correct selection of reference genes. For details on these important topics, see publications such as the MIQE guidelines (5) or other recently published scientific best practices for qPCR expression studies (6, 7, 8).

For life science research only. Not for use in diagnostic procedures.

Setting Up the Experiment and Selecting the Assays

A thoroughly planned experiment is vital to good scientific data. The RealTime ready Configurator (9) offers various possibilities to facilitate the creation of a reasonable list of target genes. For example, the focus lists available under the "Search by Focus Panel" function comprise RealTime ready assay collections for different fields of interest, such as:

• Genes involved in various signaling pathways (e.g., NF-kB, Jun, MAPK, sonic hedgehog, Notch, wnt)
• Genes involved in specific biological processes (e.g., angiogenesis, oncology, apoptosis, induction of pluripotency, stem cell differentiation)
• Genes belonging to certain families or groupings (e.g., protein kinases, phosphatases, proteases, transcription factors)

These lists can be seen as scientifically sound suggestions, and the RealTime ready Configurator also offers the freedom to add and remove assays to the desired layout as needed. The gene list for this example was generated using the focus list of assays for "Amplified/overexpressed genes in cancers", based on a publication from Santarius et al. (10). Additionally, some genes from the NF-kB pathway were added via the "Search by Pathway" function in the RealTime ready Configurator.

To demonstrate the first analytical steps in a multi-plate analysis, we set up an artificial time-series experiment with two biological samples (S01 and S02) and two time points (T01 and T02) after treatment, measured in duplicates for the cDNA synthesis step with a single qPCR measurement each. With a total of 93 genes of interest and 3 reference genes, we chose a 384-well plate layout (RealTime ready Custom Panel 384 – 96) with 4 replicates on each plate (see Figure 1).

Figure 1: Sample layout for a 384-well plate with 4 replicates of 93 targets of interest and 3 reference genes each (assays and reference genes user selected).
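The arithmetic behind this layout choice is easy to verify programmatically. The short sketch below is only an illustration; the use of Python and the exact naming of samples, time points, and cDNA replicates are assumptions, not part of the application note. It enumerates the eight sample/cDNA combinations and confirms that 96 assays in 4 replicates exactly fill one 384-well plate.

```python
# Minimal sketch of the experimental design described above; the naming of
# samples, time points, and cDNA replicates is assumed for illustration.
from itertools import product

samples = ["S01", "S02"]        # two biological samples
timepoints = ["T01", "T02"]     # two time points after treatment
cdna_reps = ["cDNA1", "cDNA2"]  # duplicate cDNA synthesis per sample and time point

# Each combination is measured with a single qPCR measurement per assay.
conditions = [f"{s}_{t}_{c}" for s, t, c in product(samples, timepoints, cdna_reps)]
print(len(conditions), "sample/cDNA combinations:", conditions)

genes_of_interest = 93
reference_genes = 3
replicates_per_plate = 4

wells_per_plate = (genes_of_interest + reference_genes) * replicates_per_plate
print("Wells used per plate:", wells_per_plate)  # 96 assays x 4 replicates = 384
assert wells_per_plate == 384                    # exactly fills a 384-well plate
```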
Performing the RealTime ready Experiment on the LightCycler® 480 Instrument

RealTime ready Custom Panels can be customized on the RealTime ready Configurator (9). The selected assays are pre-plated and ready to use on LightCycler® 480 Multiwell Plates. Single assays are also available in liquid format. The "My Orders" page of the Configurator shows the current status of all orders. After an order is produced and shipped, the page contains a .zip file that can be downloaded and is associated with the configuration. The archive contains two text files:

1. The sample editor import file with layout information (see Figure 2)
2. The configuration info file with additional assay information (see Figure 3)

General:Pos    General:Target Name     Rel Quant:Combined Sample/Target Type
A1             HS|PRKCI|105941         Target Unknown
A2             HS|AKT3|105933          Target Unknown
…              …                       …
N5             HS|NFKB2|100649         Target Unknown
N6             HS|ALAS1|102108         Ref Unknown
N7             HS|TRAF2|102966         Target Unknown
…              …                       …

Figure 2: An excerpt of the sample editor import file. The three columns contain information for the well coordinates (General:Pos), the contained assays (General:Target Name), and the type of the assay (Rel Quant:Combined Sample/Target Type). The assay information is represented as a concatenation of the organism, the gene symbol, and the assay ID, all separated by vertical dashes ("|"). The different types of assays are either a gene of interest (Target Unknown), a reference gene (Ref Unknown), or an assay from the control concept (Unassigned Unknown), not used for this demonstration. The file can be imported directly into the LightCycler® 480 sample editor software, or opened and modified in any text editor (for example, to change the reference genes or gene annotations) and then reimported afterwards.

Figure 3: Configuration info file containing assay annotation and sequence information. The example configuration info file contains additional information regarding the plate layout, gene information, and assay details including the primer sequences. Combined with the detailed assay information and annotations available at the RealTime ready Configurator, all information required by the MIQE guidelines for scientific publications is provided (5).

The contents of the archive greatly reduce the time needed to annotate and set up the experiments in the LightCycler® 480 Software. After creating a new experiment in the LightCycler® 480 Software (11), the provided file can be imported into the sample editor to assign all well and assay information for this configuration (see Figure 4). The information on how the replicates on the plate are used is plate-specific and also needs to be edited in the sample editor. It is essential to use consistent naming for samples, replicates, and any additional information that will be needed later to combine the data.

Figure 4: Editing target and sample information using the sample editor import file in the sample editor of the LightCycler® 480 Software.
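Because the sample editor import file is plain text, the layout information it carries can also be read programmatically and reused later when combining plates. The following is a minimal sketch under stated assumptions: the file is taken to be tab-separated, the column headers are spelled as in the Figure 2 caption, and the file name sample_editor_import.txt is hypothetical; verify all of these against the actual file before use.

```python
# Minimal sketch: read the sample editor import file (assumed tab-separated)
# and split the "Organism|GeneSymbol|AssayID" target names into their parts.
import csv

layout = {}           # well position -> annotation
reference_wells = []  # wells containing reference gene assays

# "sample_editor_import.txt" is a hypothetical file name; the header spellings
# below follow the Figure 2 caption and should be verified against the file.
with open("sample_editor_import.txt", newline="") as handle:
    reader = csv.DictReader(handle, delimiter="\t")
    for row in reader:
        pos = row["General:Pos"]
        organism, gene_symbol, assay_id = row["General:Target Name"].split("|")
        assay_type = row["Rel Quant:Combined Sample/Target Type"]
        layout[pos] = {
            "organism": organism,   # e.g., "HS"
            "gene": gene_symbol,    # e.g., "PRKCI"
            "assay_id": assay_id,   # e.g., "105941"
            "type": assay_type,     # "Target Unknown", "Ref Unknown", or "Unassigned Unknown"
        }
        if assay_type.startswith("Ref"):
            reference_wells.append(pos)

print(len(layout), "annotated wells,", len(reference_wells), "reference gene wells")
```

Keeping such a machine-readable copy of the layout alongside the raw exports makes the later annotation and merging steps reproducible.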
Preparing and Exporting the Result Data from the LightCycler® 480 Software

The LightCycler® 480 Software offers several analysis modules. For this example we want to evaluate the relative expression levels of mRNAs in different biological samples. To do this we compare the normalized Cq values, which can be used as a relative measurement for the abundance of cDNA covered by the particular RealTime ready assay. The Cq values are generated using the "Abs. Quant. Analysis" and the "automated second derivative" method (11). Finally, to do a meaningful comparison of different samples and genes, the Cq values need to be annotated with sample and gene information accordingly, as demonstrated below.

Exporting the Result Data from the LightCycler® 480 Software

In order to combine the results of multiple plates with varying numbers of replicates, use absolute quantification in conjunction with the second derivative method for Cq-calling. The actual relative quantification is done outside the LightCycler® Software, following standard RT-qPCR analysis procedures (11). Looking at the analysis results, some well positions might be highlighted in green, indicating that no Cq value could be calculated. Such "failures" may occur for many reasons and can be expected if large numbers of different assays are used with uncharacterized samples. RealTime ready assays are function-tested with a standard cDNA sample; therefore, missing values can be interpreted as no expression of the relevant target or, to be more precise, an expression beyond the level of detection for that particular assay. For data analysis purposes, those missing values can be dealt with in various ways, some of which will be explained briefly in the analysis sections. If, however, there are any clear patterns visible, for example, one single plate replicate or one column/row shows no calls, you should check the starting materials and the pipetting schemas for possible technical errors.

Figure 5: Export of the result Cq values in the LightCycler® 480 Software by right-clicking in the result table and saving the file to an appropriate location.

After the "Abs. Quant. Analysis", the result data needs to be exported by right-clicking in the result table in the analysis section of the LightCycler® 480 Software for every individual run (see Figure 5). For the sake of data quality, consider how and where to organize the data. In this context, having a dedicated folder structure with, for example, file names and/or short experimental descriptions along with the result and IXO file names for a particular round of experiments, is a good start (see Figure 6). For this workflow example, the experiment consisted of 3 LightCycler® 480 Instrument runs saved as "result_CONFIG_ID_plateXYZ.txt".

Figure 6: File and folder structure for the experiment. 1. Short experiment description; 2. Sample editor import file; 3. Zip file provided by Roche containing the sample editor import file and assay annotation/information; 4 & 5. Exported result files for the individual plates.

After exporting the results, the run and data gathering step is finished and the data needs to be cleaned and combined in order to be analyzed.

General Considerations for Relative Quantification Expression Data

Multi-plate, multi-parameter analysis can be conducted in several ways; this note covers only the first steps of cleaning and combining the results. These steps are usually constant and independent of the level of sophistication followed in the statistical analysis afterwards. For a "fold change analysis" these steps are:

1.
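As an illustration of such cleaning, combining, and fold-change calculation steps, the sketch below reads hypothetical per-plate export files, annotates them, replaces missing Cq values with a detection-limit cutoff, normalizes against reference genes, and computes a standard 2^-ΔΔCq fold change. The file names, the tab-separated format, the "Pos"/"Name"/"Cp" column names, the sample_annotation.txt mapping table, the cutoff of 35, and the reference gene list are all assumptions for illustration and must be adapted to the actual exports and experimental design.

```python
# Minimal sketch (assumed file layout and column names): combine exported
# per-plate result files with sample/gene annotation and compute fold changes
# via the standard delta-delta-Cq approach.
import pandas as pd

# 1. Read the exported result files (assumed tab-separated with 'Pos', 'Name',
#    and 'Cp' columns; verify against the actual LightCycler 480 export).
plate_files = ["result_CONFIG_ID_plate1.txt",   # hypothetical file names
               "result_CONFIG_ID_plate2.txt",
               "result_CONFIG_ID_plate3.txt"]
results = pd.concat(
    (pd.read_csv(path, sep="\t").assign(plate=path) for path in plate_files),
    ignore_index=True,
)

# 2. Derive the gene symbol from the target name ("HS|GENE|AssayID") and merge
#    a hypothetical per-well annotation table (columns: plate, Pos, sample)
#    that encodes the plate-specific replicate-to-sample assignment.
results["gene"] = results["Name"].str.split("|").str[1]
annotation = pd.read_csv("sample_annotation.txt", sep="\t")
data = results.merge(annotation, on=["plate", "Pos"])

# 3. Missing Cq values (no call) are set here to a detection-limit cutoff.
data["Cq"] = data["Cp"].fillna(35.0)

# 4. Normalize per sample: dCq = Cq(gene) - mean Cq(reference genes).
reference_genes = {"ALAS1"}   # extend with the other reference genes used
ref_mean = (data[data["gene"].isin(reference_genes)]
            .groupby("sample", as_index=False)["Cq"].mean()
            .rename(columns={"Cq": "ref_Cq"}))
data = data.merge(ref_mean, on="sample")
data["dCq"] = data["Cq"] - data["ref_Cq"]

# 5. Fold change of a sample versus a calibrator sample: 2 ** (-ddCq).
def fold_change(df, gene, sample, calibrator):
    d = df[(df["gene"] == gene) & (df["sample"] == sample)]["dCq"].mean()
    d0 = df[(df["gene"] == gene) & (df["sample"] == calibrator)]["dCq"].mean()
    return 2 ** (-(d - d0))

# Example (assuming samples were annotated as "S01_T01", "S01_T02", ...):
# print(fold_change(data, "NFKB2", "S01_T02", "S01_T01"))
```

The same cleaned and combined table can equally be pasted into a spreadsheet or imported into GenEx or qbasePLUS, which is why these first steps are independent of the analysis route chosen afterwards.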