Integration of a Web-Based Rating System with an Oral Proficiency Interview Test: Argument-Based Approach to Validation
Hye Jin Yang, Iowa State University


Iowa State University Digital Repository, Graduate Theses and Dissertations, 2016

Recommended Citation: Yang, Hye Jin, "Integration of a web-based rating system with an oral proficiency interview test: argument-based approach to validation" (2016). Graduate Theses and Dissertations. 15189. https://lib.dr.iastate.edu/etd/15189

Integration of a web-based rating system with an oral proficiency interview test: Argument-based approach to validation

by

Hye Jin Yang

A dissertation submitted to the graduate faculty in partial fulfillment of the requirements for the degree of DOCTOR OF PHILOSOPHY

Major: Applied Linguistics and Technology

Program of Study Committee:
Carol A. Chapelle, Co-Major Professor
Elena Cotos, Co-Major Professor
Volker Hegelheimer
Gary Ockey
Frederick O. Lorenz
Jo Mackiewicz

Iowa State University
Ames, Iowa
2016

Copyright © Hye Jin Yang, 2016. All rights reserved.
I dedicate this dissertation to my mom and dad, Ok Chun Hwang & Seung Woo Yang.

TABLE OF CONTENTS

LIST OF FIGURES
LIST OF TABLES
ACKNOWLEDGMENTS
ABSTRACT

CHAPTER 1 INTRODUCTION
1.1 Background
1.2 Problems in the Context of This Dissertation
1.3 Purpose of This Dissertation
1.4 Significance of the Study
1.5 Dissertation Overview

CHAPTER 2 THEORETICAL AND EMPIRICAL FOUNDATION
2.1 Issues in Computer-Assisted Language Testing
2.1.1 History and key attributes of CALT
2.1.2 CALT types
2.1.3 Construct validation studies in CALT
2.1.3.1 Correlation studies
2.1.3.2 Comparison of examinees' performance between CBT and PBT
2.1.3.3 Sources of construct-irrelevant variance relevant to computers
2.1.4 Gaps in research
2.2 Assessing Speaking Ability in Language Performance Tests
2.2.1 Conceptualization of the speaking performance test process
2.2.2 Rater variability
2.2.2.1 Rater severity
2.2.2.2 Rating scale use
2.2.3 Task variability
2.2.3.1 Task difficulty factors
2.2.3.2 Empirical methods of estimating task difficulty
2.2.4 Gaps in research

CHAPTER 3 CONTEXT AND ARGUMENT-BASED APPROACH TO VALIDATION
3.1 Context
3.1.1 Components of the OECT
3.1.2 Rating procedure
3.2 Rater-Platform (R-Plat)
3.3 Interpretive Argument for the OPI Scores
3.4 Research Questions

CHAPTER 4 METHODOLOGY
4.1 Research Design
4.2 Participants
4.3 Materials
4.3.1 Questionnaire
4.3.2 Focus group and individual interviews
4.3.3 OPI prompts
4.3.4 Scoring rubric
4.3.5 Diagnostic descriptors
4.4 Procedure
4.4.1 Questionnaire
4.4.2 Focus group and individual interviews
4.4.3 OPI prompt rotation
4.4.4 Rating results
4.5 Data Analysis
4.5.1 Raters' perceptions towards R-Plat (RQ1)
4.5.2 Comparisons of diagnostic descriptor markings (RQ2)
4.5.3 Raters' comments as indicators of speaking ability levels (RQ3)
4.5.4 Descriptive statistics (RQ4)
4.5.5 OPI ratings as reliable indicators of different speaking ability levels (RQ5)
4.5.6 Consistency of OPI prompts at different difficulty levels (RQ6)
4.5.7 Consistency of raters within each administration (RQ7)

CHAPTER 5 RESULTS
5.1 Raters' Perceptions towards R-Plat
5.1.1 Clarity of R-Plat
5.1.2 Level of raters' comfort with R-Plat
5.1.3 Effectiveness of R-Plat and raters' satisfaction
5.2 Use of Diagnostic Descriptors to Support Proficiency Level Ratings
5.2.1 Proficiency level comparisons of diagnostic descriptor markings
5.2.2 Seven categories of diagnostic descriptors
5.2.3 Raters' reasons for selecting diagnostic descriptors
5.3 Use of Raters' Comments to Support Proficiency Level Ratings
5.3.1 Inter-coder reliability
5.3.2 Comparison of positive and negative comments across proficiency levels
5.3.3 Comparison of positive and negative evaluative units grouped by the OPI scoring criteria across proficiency levels
5.4 Descriptive Statistics
5.5 Dependability of OPI Ratings
5.5.1 Descriptive statistics for OPI ratings
5.5.2 Unidimensionality assumption check
5.5.3 Dependability of the OPI ratings
5.6 Comparison of Intended Prompt Level and Observed Difficulty
5.6.1 Prompt difficulty at the advanced level
5.6.2 Prompt difficulty at the intermediate-high level
5.6.3 Prompt difficulty at the intermediate-mid