STATE BASED TRANSITION AND MOCK EXAMINATIONS AS PREDICTORS OF STUDENTS’ ACHIEVEMENT IN SENIOR SECONDARY SCHOOL EXAMINATION IN KOGI STATE

BY

OBAJEMU, STEPHEN FOLORUNSO PG/M.Ed/12/63757

DEPARTMENT OF SCIENCE EDUCATION FACULTY OF EDUCATION UNIVERSITY OF NIGERIA, NSUKKA.

NOVEMBER, 2016

TITLE PAGE

STATE BASED TRANSITION AND MOCK EXAMINATIONS AS PREDICTORS OF STUDENTS’ ACHIEVEMENT IN SENIOR SECONDARY SCHOOL EXAMINATION IN KOGI STATE

BY

OBAJEMU, STEPHEN FOLORUNSO PG/M.Ed/12/63757

DEPARTMENT OF SCIENCE EDUCATION FACULTY OF EDUCATION UNIVERSITY OF NIGERIA, NSUKKA.

SUPERVISOR PROF. S. A. EZEUDU

NOVEMBER, 2016


APPROVAL PAGE

This thesis has been approved for the Department of Science Education, Faculty of Education,

University of Nigeria, Nsukka.

By

…………………………….
Prof. S. A. Ezeudu
Supervisor

…………………………….
Dr. (Mrs) Ebere Ibe
Internal Examiner

…………………………….
External Examiner

…………………………….
Prof. Z. C. Njoku
Head of Department

…………………………….
Prof. U. C. Umo
Dean, Faculty of Education


CERTIFICATION

Obajemu, Stephen Folorunso, a postgraduate student in the Department of Science Education with Reg. No. PG/M.Ed/12/63757, has satisfactorily completed the requirements for the award of the Master's degree (M.Ed) in Measurement and Evaluation.

The work embodied in this thesis is original and has not been submitted in part or whole for any other Diploma or Degree of this or any other University.

…………………………….
Obajemu, Stephen Folorunso
Student

…………………………….
Prof. Ezeudu S. A.
Supervisor


DEDICATION

This work is dedicated to the Almighty God for His favour, grace, strength and the gift of life to be able to embark on a programme like this and to my beloved wife, Obajemu Rachael Fisayo, for her encouragement and support.


ACKNOWLEDGEMENTS

First and foremost, I express my profound gratitude to Almighty God, by whose grace and mercy this work was possible. I equally appreciate the effort of my supervisor, Prof. S. A. Ezeudu, for his support, suggestions and for reading through the manuscript of this work in the face of his tight schedule. Sir, God bless you.

Also worthy of special appreciation are the lecturers in the Department of Science Education, especially the Head of Department, Prof. Z. C. Njoku, as well as Prof. B. G. Nworgu, Prof. D. N. Eze and Dr. B. C. Madu for their candid advice, support and the pains they took to go through the instrument, and Dr. J. J. Ezeugwu, Dr. U. J. Utibe and Mr. Uguanyi Christian for their timely support.

I cannot but appreciate my readers, Prof. A. A. Nwosu, Dr. J. J. Agah and Dr. E. Ibe, for the time they took to critique this work and give it a better look. May God bless you all. Worthy of special appreciation are my beloved wife, Mrs. Obajemu Rachael F., and my loving father, who did not live to see the outcome of this work despite all his contributions to the success of this programme. May his soul rest in perfect peace.

The researcher equally thanks some of his course mates, especially Dike Felix, Osinachi Ochei, Benjamen, Ayozie Chiwendu, Okundu Chidinma, Ella Barnabas, Chibuzor and Thank God Dieze, for their great support. Aleke Nkemdilim is equally appreciated for arranging this work into a befitting state. I appreciate you; I love you all.


TABLE OF CONTENTS

TITLE PAGE I

APPROVAL PAGE II

CERTIFICATION III

DEDICATION IV

ACKNOWLEDGEMENTS V

TABLE OF CONTENTS VI

LIST OF TABLES IX

LIST OF FIGURE X

ABSTRACT XI

CHAPTER ONE: INTRODUCTION 1

Background of the Study 1

Statement of the problem 9

Purpose of the study 9

Significance of the study 10

Scope of the study 12

Research questions 12

Hypotheses 13

CHAPTER TWO: REVIEW OF LITERATURE 14

Conceptual Framework 15

Concept of Examination 15

Concept of Validity 17

Concept of Achievement in some school subjects 21

The concept and meaning of correlation and regression 28


Theoretical Framework 30

Classical Test Theory 31

Theory of Regression 33

Review of Related Empirical Studies 36

Summary of Literature Review 48

CHAPTER THREE: RESEARCH METHOD 50

Design of the Study 50

Area of the Study 50

Population of the Study 51

Sample and Sampling Technique 51

Instrument for Data Collection 52

Validation of Instrument 52

Reliability of Instrument 53

Method of Data Collection 53

Method of Data Analysis 53

CHAPTER FOUR: RESULTS 54

Research Question 1 54

Hypothesis 1 56

Research Question 2 57

Hypothesis 2 58

Research Question 3 58

Hypothesis 3 59

Research Question 4 60

Hypothesis 4 61


Research Question 5 61

Hypothesis 5 62

Research Question 6 63

Hypothesis 6 64

Summary of the Results 64

CHAPTER FIVE: DISCUSSION OF RESULTS, RECOMMENDATIONS AND SUMMARY 66

Discussion of the Results 66

Conclusion 71

Educational Implication of the study 72

Limitations of the Study 74

Recommendations 74

Suggestion for further studies 75

Summary of the Study 76

REFERENCES 79

APPENDIX 86

A: Names of the public secondary schools in Mopamuro LGA 86

B: Names of the public secondary schools in Kabba/Bunnu LGA 87

C: Students’ grade chart (SGC) 88

D: Output of data analysis 89


LIST OF TABLES

Tables

1. Correlation matrix of the independent and dependent variables 54

2. Multiple regression analysis of predictor variables with the criterion variable SSCE 55

3. ANOVA of relationship between Maths TRANS, MOCK and SSCE Maths 56

4. ANOVA of relationship between English Language TRANS, MOCK and SSCE English Language 58

5. ANOVA of relationship between Physics TRANS, MOCK and SSCE Physics 59

6. ANOVA of relationship between Chemistry TRANS, MOCK and SSCE Chemistry 61

7. ANOVA of relationship between Biology TRANS, MOCK and SSCE Biology 62

8. ANOVA of relationship between TRANS, MOCK and SSCE 64


LIST OF FIGURE

Figure 1: Schematic diagram of the relationship between the variables 30


ABSTRACT

The purpose of this study was to determine the extent to which the Transition and Mock Examinations predict students' achievement in the SSCE. The study adopted a correlational survey research design and was carried out in Kabba/Bunu and Mopamuro local government areas in the western senatorial district of Kogi State. The population comprised 9,677 senior secondary II students who sat for the Transition Examination in the 2013/2014 academic session and wrote the Mock Examination in the first term of SS III. A sample of 520 students from the two local government areas was drawn using a proportionate stratified sampling technique. The instrument used for data collection was the Students' Grade Chart (SGC), a standardized pro-forma designed so that students' grades could be obtained in stanine form, that is, A1, B2, B3, C4, C5, C6, D7, E8 and F9, assigned 9, 8, 7, 6, 5, 4, 3, 2 and 1 respectively. The research questions were answered using descriptive statistics, while the hypotheses were tested at the .05 level of significance using regression analysis and the associated Analysis of Variance (ANOVA). The major findings of the study showed that: (1) Transition and Mock Examinations in Mathematics are good predictors of SSCE Mathematics; Transition and Mock scores accounted for 58% of the variance of students' achievement in SSCE Mathematics. (2) There was a good prediction of SSCE English Language by the Transition and Mock Examinations; Transition and Mock Examinations in English Language predicted 42% of the total variance of students' achievement in SSCE English Language. (3) Transition and Mock Examinations in Physics are good predictors of SSCE Physics; Transition and Mock scores accounted for 38% of the variance of students' achievement in SSCE Physics. (4) The prediction of SSCE Chemistry by the Transition and Mock Examinations was good; Transition and Mock scores accounted for 50% of the variance of students' achievement in SSCE Chemistry. (5) Transition and Mock Examinations in Biology are good predictors of SSCE Biology; Transition and Mock scores accounted for 37% of the variance of students' achievement in SSCE Biology. (6) The combination of Transition and Mock Examination scores jointly accounted for 72% of the variance of students' achievement in the SSCE. Based on these findings, it was recommended, among others, that the Kogi State Ministry of Education should maintain the selection of students for the SSCE, sustain the Transition Examination and improve upon the repackaged Mock Examination for better performance in these external examinations.


CHAPTER ONE

INTRODUCTION

Background of the Study

There are examination bodies charged with the responsibility of conducting standardized examinations in Nigeria for the certification of senior secondary school students. These bodies include the West African Examinations Council (WAEC), the National Examinations Council (NECO) and the National Business and Technical Examinations Board (NABTEB), among others.

According to a report made available by WAEC (2015), the University of Cambridge Local Examinations Syndicate, the School Examinations Matriculation Council and the West African Department of Education met in 1948 concerning education in West Africa.

Dr. George Barker Jeffery was appointed at the meeting to visit some West African states to assess the general level of education. The report by Jeffery (1950) supported the proposal for a West African Examinations Council and made detailed recommendations on the duties of the council. The report was adopted without reservation by the four West African governments (Nigeria, the Gambia, Ghana and Sierra Leone), and an ordinance (WAEC Ordinance No. 40) establishing the council as a corporate body was drafted by the West African Inter-Territorial Secretariat in consultation with the governments.

The Nigerian government provided accommodation for the body in 1953; to this effect, a block at the Technical Institute, Yaba, was given to WAEC. The four member countries are Nigeria, the Gambia, Ghana and Sierra Leone.

The following are the categories of examinations conducted by the body:

1) National Examinations – These are restricted to the specific member countries for which they are developed and reflect their local policies, needs and aspirations.



2) International Examinations – These are developed for candidates in all the member countries. The West African Senior School Certificate Examination (WASSCE) is one such examination. Students must have at least five credit passes, including English Language and Mathematics, to gain admission to study in the university. Because of the importance of these external examinations to students, adequate preparation must therefore be made to help students perform well in them.

Different attempts have been made by school administrators to prepare students for these examinations. The Mock Examination has been the popular form of examination conducted to help prepare students for these external examinations. However, mock examinations conducted at both school and sectorial levels have been criticized by many scholars as being biased and subject to teachers' manipulation, which makes them inadequate for predicting the performance of students in external examinations. Students are not judged equally, as different types of mock examinations are conducted by teachers in various schools and different senatorial districts. There is no uniformity in the examinations taken by students to prepare them for the same external examinations. Standards differ from school to school and from district to district.

The items in these mock examinations conducted at school and district levels are not standardized, so the psychometric properties of the items are not ascertained. This makes such examinations incomparable with the external examinations, which are developed in a standardized form by experts and have known psychometric properties. Standardization establishes item parameters such as the difficulty index, the discrimination index and the distractor index.
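For illustration, the sketch below shows how the difficulty and discrimination indices mentioned above are typically estimated in classical item analysis. The response data, the function name and the upper/lower 27% grouping convention are illustrative assumptions, not taken from the examinations discussed in this study.

# Minimal sketch: classical item analysis on hypothetical 0/1 item responses.
# The data and names are illustrative, not drawn from the study's instruments.

def item_analysis(responses, item):
    """responses: list of dicts {item_name: 0 or 1}; returns (difficulty, discrimination)."""
    scored = sorted(responses, key=lambda r: sum(r.values()), reverse=True)
    n = len(scored)
    k = max(1, int(0.27 * n))                    # conventional upper/lower 27% groups
    upper, lower = scored[:k], scored[-k:]
    difficulty = sum(r[item] for r in scored) / n                      # proportion answering correctly
    discrimination = (sum(r[item] for r in upper) - sum(r[item] for r in lower)) / k
    return difficulty, discrimination

if __name__ == "__main__":
    data = [
        {"q1": 1, "q2": 1}, {"q1": 1, "q2": 0}, {"q1": 1, "q2": 1},
        {"q1": 0, "q2": 0}, {"q1": 1, "q2": 0}, {"q1": 0, "q2": 0},
    ]
    print(item_analysis(data, "q1"))   # roughly (0.67, 1.0): a moderately easy, discriminating item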

To better prepare students for these external examinations, the Kogi State government in 2008 introduced an examination called the Transition Examination, which was borne out of the government's interest in providing a reliable ground of preparation for students to perform better in the external examinations. The Transition Examination is taken in the third term of Senior Secondary II (SS II). It serves the two major purposes of a promotion examination to Senior Secondary III (SS III) and of preparing students for external examinations. The examination is taken throughout the state under the same examination conditions as the external examinations and is marked centrally by teachers who indicate interest and are paid like examiners in the external examinations. The Transition Examination was also designed to have a motivating feature.

The government's policy that established the Transition Examination in Kogi State includes the following:

1) The examination is for students transiting from SS 2 to SS 3 in all government-approved secondary schools in the state.

2) The examination should be called Kogi State Common Transition Examination.

3) All subjects examined by WAEC and NECO and offered in Kogi schools are to be tested. These include, among others, English Language, Mathematics, Biology, Chemistry and Physics.

4) The items will be generated by subject specialists teaching the students in schools within the state.

5) There will be item writing and moderation exercises involving experts. Four specialists per subject will be involved in item generation and moderation.

6) Conduct of the examination:

- All schools will be centres.

- Vice principals will supervise the examinations in schools other than theirs.

- The principals will provide resources for practical tests.

- Principals are to provide resources for packaging of worked scripts.

- Principals are to deliver worked scripts to the Area Inspectorate of Education (AIE)/custodian after the examination.

7) Marking exercises:

- Zonal marking to be coordinated by zonal AIEs and monitored by officials of the Ministry of Education (MOE).

- Zonal AIEs will appoint markers.

- Zonal AIEs will receive scripts from principals or AIE/custodians.

- Zonal AIEs will brief and coordinate team leaders and assistant examiners.

- Payment of assistant examiners and team leaders will be at the zonal centres.

8) Processing of results:

- Collation of results by the MOE.

- Computerization to be handled by the MOE.

9) Release of results and promotion parameters to be determined by the MOE:

- At least passes in five subjects, including English Language and Mathematics.

- Candidates who fail will repeat the SS 2 class.

- A candidate who has already repeated once may be promoted even if he/she fails again.

The government motivates the students by paying the WAEC fee of any student that passes at the five-credit level, including English Language and Mathematics. The objectives of the Transition Examination include:

i) To organize a qualifying examination from Senior Secondary II (SS II) into Senior Secondary III (SS III);

ii) To standardize the quality of students that will be presented for WAEC;

iii) To create a platform of ready candidates for government sponsorship in terms of payment of WAEC registration; and

iv) To curb the menace of examination malpractice in terms of registering external candidates who are not prepared for the examination (Kogi State Ministry of Education, 2008).

Students' better performances in the Transition Examination would have been expected to reflect in their performances in the external examinations, but this has not been so. Despite all the government's efforts in funding this examination and paying the WAEC SSCE registration fees of students who passed with five credits in subjects including English Language and Mathematics, the performance of students in external examinations is still poor. There is a need, therefore, to further prepare students in order to curb this menace of mass failure in the external examinations.

To further strengthen the preparedness of students for the external examinations, the Kogi State government, in 2012, repackaged the Mock Examination to be conducted centrally and under the same examination conditions as the external examinations. It is taken in the first term of SS III. The examination papers are equally marked centrally by teachers. Recently, the results of the Mock Examination were published, given out to students and made accessible on the internet, which is different from the usual practice of keeping the results with the school principals. All this is done by the government to equip the students for the public (external) examinations.

Public examinations are viewed as external school examinations open to the general public and conducted by examination bodies using tests that have known psychometric properties. These are better developed than the ones prepared by teachers in the school setting (Adeyegbe, 2004). The examinations are external in the sense that the examining boards conducting them did not themselves prepare the students for the examinations. They are examinations that are designed and organised under specific terms and conditions and are based on norms that are regarded as standards (Adeyemi, 2008). To know students' standing in these examinations, therefore, students must be evaluated.

Evaluation is seen as a qualitative description of students' behaviour. No matter how efficient the teacher is, how intelligent the students are and how adequate the audio-visual equipment, if no provision is made for the evaluation of the students' progress, the teaching effort may be completely invalidated. Evaluation concerns determining the quality of the curriculum, the facilities available and the performance of pupils, using various tools. The test is one such tool for evaluation (Obinne, 2011). The practical relevance of these tests and of testing is largely dependent on their levels of reliability, validity, difficulty and discrimination. Validity is, according to the Standards for Educational and Psychological Testing, "the most fundamental consideration in developing and evaluating tests" (Hogan & Angello, 2004).

Tests are administered in these examinations (Transition, Mock and external examinations) to determine the level of achievement of students. A test could be used as a measuring instrument to predict the future academic performance of students. Such a test, if it is to be used to predict students' achievement, must be valid.

Validity is the extent to which a test measures what it ought to measure. Validity is of different types. One of these is criterion-related validity, which is concerned with the degree to which performance on an instrument can estimate or predict performance in other situations. The measure from the instrument is referred to as the predictor variable (or predictor), whereas the performance in the other situation which the instrument is supposed to predict is called the criterion variable. As related to this study, the Transition and Mock Examinations are the predictor variables while the SSCE is the criterion variable. Criterion-related validity is sub-divided into two types, namely concurrent validity and predictive validity. Concurrent validity applies when the criterion score is obtained at about the same time as the test score, that is, when the two scores are obtained concurrently.

Predictive validity is the extent to which a person's present scores can be used to estimate future performance. It is intended to predict how a person will perform at a later date on a different assessment of his or her abilities using present performance measures (Garson, 2008). Emaikwu (2011) stated that predictive validity refers to how accurately a person's current test score can be used to estimate what the criterion score would be at a later time. The Mock and Transition Examinations are the predictor variables that will be used to predict the performance of students in the SSCE (the criterion measure). Gall, Gall and Borg (2007) were of the view that predictive validity is the degree to which the predictions made by tests are confirmed by the later behaviour of the subjects. They maintained that much educational research is concerned with the prediction of success in various activities.
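As a brief illustration of how a predictive validity coefficient is obtained, the sketch below correlates hypothetical predictor grades with hypothetical criterion grades, using the A1 = 9 ... F9 = 1 point coding described for the Students' Grade Chart. The candidate grades and function names are invented for illustration and are not data from this study.

# Minimal sketch: estimating a predictive validity coefficient as the Pearson
# correlation between an earlier examination score and a later criterion score.
# Grade-to-point mapping follows the Students' Grade Chart coding (A1 = 9 ... F9 = 1);
# the candidate grades below are hypothetical, not data from the study.
from math import sqrt

POINTS = {"A1": 9, "B2": 8, "B3": 7, "C4": 6, "C5": 5, "C6": 4, "D7": 3, "E8": 2, "F9": 1}

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

mock_grades = ["A1", "B3", "C4", "C6", "D7", "E8"]   # predictor grades (hypothetical)
ssce_grades = ["B2", "B3", "C5", "C6", "E8", "F9"]   # criterion grades (hypothetical)

mock = [POINTS[g] for g in mock_grades]
ssce = [POINTS[g] for g in ssce_grades]
print(round(pearson_r(mock, ssce), 2))   # validity coefficient, close to 1 for these values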

The aim of the government of Kogi State is to use the Transition and Mock Examinations to prepare students for better performance in the external examinations. There is therefore a need to ascertain the predictive validity of these examinations, that is, whether they can be used to predict the achievement of students in the SSCE. Despite all these efforts by the government, the poor achievement of students in the external examinations has become worrisome; hence the need for this study to determine how both the Mock and Transition Examinations predict students' achievement in the SSCE. This will give direction on whether to repackage the examinations or discard them and devise a better one.

The researcher used Mathematics, English Language, Physics, Chemistry and Biology as the subjects of prediction because Mathematics and English Language are compulsory core subjects in secondary schools which must be passed at credit level for students to be able to pursue their academic careers, while Physics, Chemistry and Biology are science-based subjects. According to the Federal Republic of Nigeria (2013), core subjects are basic subjects that enable a student to offer Arts or Science in higher education. Without a solid foundation in Mathematics, meaningful advancement in science and technology cannot be made. According to Kolawole and Udoh (2012), Mathematics is the tool for the development of science-based knowledge such as technology and industry and, even, for sound analytical reasoning in daily living in the present age. Also, without a solid foundation in English Language, students may not be able to make headway in art-related courses. Adesoji (2008) asserted that abilities in English Language do influence students' knowledge of other subjects in the curriculum. Physics, Chemistry and Biology are the major science subjects that qualify and enable students to study science-related courses in Nigerian higher institutions of learning (FME, 2013).

For students to perform better in an examination that will determine how far they can go in life in terms of education, the predictive validity of such an examination ought to be ascertained. Hence the need for this study, since, to the best of the researcher's knowledge, no research work had been carried out on the predictive validity of the Transition Examination since its inception in Kogi State in 2008, or of the Mock Examination since it was repackaged in 2012.

Statement of the Problem

Generally, mock examinations are used to predict students' performance in the main examination, and the Transition Examination serves the same purpose. The Mock Examination, as a trial examination, is selective, predictive and diagnostic in nature, showing how successfully teachers' instruction has been mastered. Prediction cannot be established if the examination that is to serve as the predictor does not have a close relationship, in terms of the correlation coefficient, with the criterion variable. The value of the correlation coefficient between the criterion and the predictor variables could qualify it, or otherwise, as an examination that can actually serve as the basis of prediction. In other words, it is when the predictive validity is ascertained that the examination can be reliably used to predict future occurrence. Despite the introduction of these examinations (Transition and Mock), the achievement of students in the external examinations is still worrisome. Since the introduction of the Transition Examination in Kogi State in 2008 and the Centralized Mock Examination in 2012, no study has been carried out to determine their predictive validities and suggest whether to continue using them to prepare secondary school students for their final examinations conducted by WAEC, NECO or NABTEB. Hence the need for this study, as it seeks to determine the predictive validity of these examinations. The question, therefore, is: what is the predictive validity of the Transition and Centralized Mock Examinations in Kogi State?

Purpose of the Study

The purpose of this study is to determine the extent to which the Transition and Mock Examinations predict the achievement of students in the SSCE. Specifically, the study seeks to:

1) Determine the extent to which Transition and Mock Examinations in Mathematics predict students' achievement in SSCE Mathematics;

2) Determine the extent to which Transition and Mock Examinations in English Language predict students' achievement in SSCE English Language;

3) Determine the extent to which Transition and Mock Examinations in Physics predict the achievement of students in SSCE Physics;

4) Determine the extent to which Transition and Mock Examinations in Chemistry predict the achievement of students in SSCE Chemistry;

5) Determine the extent to which Transition and Mock Examinations in Biology predict the achievement of students in SSCE Biology; and

6) Determine which of the examinations, Transition or Mock, best predicts the achievement of students in SSCE.

Significance of the Study

The findings of this study would have both theoretical and practical significance.

Theoretically, the findings of this study will support the theory of regression. In regression, several pairs of values of two variables, say X and Y, are usually taken on the same individuals, as indicated in the equation Y = a + bX, where Y is the dependent variable (the variable to be predicted), X is the predictor or independent variable, and a and b are constants. The constants are usually calculated using the values of X and Y. When only two variables are involved (X and Y), the regression is said to be simple; when more than two variables are involved, it is called multiple regression. In this study, the Transition and Mock Examinations are the predictor variables, which could be denoted as X1 and X2 respectively, while the SSCE is the criterion variable (Y).
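A minimal sketch of how the constants a and b in Y = a + bX are estimated from paired scores by least squares is given below; the paired values are hypothetical and are not the study's data.

# Minimal sketch: estimating the constants of the simple regression line Y = a + bX
# by ordinary least squares. The paired scores below are hypothetical.
def simple_regression(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx          # the fitted line passes through the point of means
    return a, b

x = [4, 5, 6, 7, 8, 9]       # e.g. predictor points (hypothetical)
y = [3, 5, 5, 6, 8, 9]       # e.g. criterion points (hypothetical)
a, b = simple_regression(x, y)
print(f"Y = {a:.2f} + {b:.2f}X")   # fitted prediction equation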

Practically, this study is significant to school administrators, researchers, the examination bodies, students and teachers, the government, and parents and guardians. Through it, school administrators would know which of the examinations (Transition and Mock) best predicts the achievement of students in the SSCE. This will help in improving the level of instruction in terms of teaching, and in formulating and implementing policies that will enhance teaching and learning for an improvement in the educational sector.


Other researchers will benefit from the results of this study, as it will serve as a reference point for them. It will form an existing study on predictive validity upon which they can stand to enrich their work, or a take-off point for further research.

Moreover, the outcome of this study will help teachers and students alike to predict academic achievement in the SSCE. Students can be encouraged to be more serious with their studies, while teachers take the examinations (Transition and Mock) seriously along with the students.

It can be an eye-opener to the government on whether to continue funding the examinations and motivating the students by paying their WAEC fees in Senior Secondary School III (SS III), so as to encourage them to study hard and attain greater heights in their academic pursuit.

The findings of this study will equally be of great benefit to society at large, as they will encourage parents and guardians to monitor their children and wards and to ensure that they use the Transition and Mock Examinations as preparatory grounds.

Scope of the Study

The study is delimited to two core subjects and three science subjects, namely Mathematics, English Language, Physics, Chemistry and Biology. It is also delimited to the 2013 Transition Examination in SS II, the 2014 Mock Examination taken at the beginning of the SS III session, and the 2014 WAEC SSCE in these subjects in Kogi State. Using all the subjects would be cumbersome.

Research Questions

The following research questions were posed to guide the study.

i) To what extent do Transition and Mock Examinations in Mathematics predict the achievement of students in WAEC SSCE Mathematics?

ii) To what extent do Transition and Mock Examinations in English Language predict the achievement of students in WAEC SSCE English Language?

iii) To what extent do Transition and Mock Examinations in Physics predict the achievement of students in WAEC SSCE Physics?

iv) To what extent do Transition and Mock Examinations in Chemistry predict the achievement of students in WAEC SSCE Chemistry?

v) To what extent do Transition and Mock Examinations in Biology predict the achievement of students in WAEC SSCE Biology?

vi) Which of the examinations, Transition or Mock, best predicts the students' achievement in WAEC SSCE?

Hypotheses

The following null hypotheses were tested at the .05 level of significance:

i) Performance in Transition and Mock Examinations does not significantly predict performance in WAEC SSCE Mathematics.

ii) Performance in Transition and Mock Examinations does not significantly predict performance in WAEC SSCE English Language.

iii) Performance in Transition and Mock Examinations does not significantly predict performance in WAEC SSCE Physics.

iv) Performance in Transition and Mock Examinations does not significantly predict performance in WAEC SSCE Chemistry.

v) Performance in Transition and Mock Examinations does not significantly predict performance in WAEC SSCE Biology.

vi) There is no significant difference in the predictive validity of the Transition and Mock Examinations.

CHAPTER TWO

REVIEW OF LITERATURE

This chapter reviews literature found pertinent to this study. The review is organised under the following headings:

Conceptual Framework.

Concept of Examination

Concept of Validity

Concept of Achievement in some school subjects

The concept and meaning of correlation and regression

Theoretical framework

Classical Test Theory

Theory of regression

Empirical Studies.

Predictive validity of Mathematics and English Language

Predictive validity of MOCK – SSCE in practical Chemistry

English Language and Mathematics as predictors of students' achievement in Physics

Summary of Literature Review.



Conceptual Framework

Concept of Examinations

The term examination means different things to different people. An examination is a means by which students' knowledge and understanding are assessed and the basis for awarding certificates (Orji, 2013). The periodic testing and measurement of students after the acquisition of experience is known as examination (Olusola, 2006). Olusola went further to state that examination is the pivotal point around which the whole system of education revolves, and that the success or failure of the system of examination is indeed an indicator of the success or failure of that particular system of education. Students are evaluated at the end of each learning programme or level of education. At the classroom level, the results are used in the form of rewards. These help students to achieve effectively and to try as much as possible to maintain their level of achievement, while those who achieved low will struggle to improve on their level of achievement (Harbor-Peters, 1999). Examination is an instrument that brings out the inbuilt abilities in learners.

Examination elicits responses from learners in order to place them appropriately on an ability scale. With the introduction of Item Response Theory (IRT), it is possible to measure the level of ability of the learner, as opposed to the old idea that an arbitrary ability scale can be used since it is difficult to measure the level of a latent trait in a person (Frank, 2001). Because of the significant role that examinations play in securing the future careers of students (good credit passes are required for progression to higher institutions of learning), many want to pass at all costs, and some under tension. All these have resulted in the various forms of examination malpractice that we have in our society today. Evidence abounds that both examinees and examiners regard examination, or any form of test, as an anxiety-provoking process. Students and parents alike are fearful and always tense because examination results affect individual career plans. The roles of testing and examination include: to award certificates, to give progress reports to parents on their wards' intellectual abilities, to promote students to new classes or to seek admission into higher institutions of learning, to identify and adjudge those suitable for employment, and to monitor, govern and socialize individuals into a progressive and highly knowledgeable society (Abdullahi, 2009).

Examinations could be external or internal, formal or informal. An example of a formal examination is the final examination administered by the teacher in a classroom, while an informal examination could be one conducted by a parent on a child. An examination could be in written form: multiple choice, completion type, essay, true/false, etc. The two basic assumptions of any examination worth its name are that (a) it should be valid and (b) it should be reliable (Zakka, 2014). The importance of examination can, therefore, not be over-emphasized, as it is a major instrument in determining the level of ability of an individual.

In this study, however, the data used are the results of formal examinations conducted by the state government, namely the Transition and Mock Examinations, and of the external examination conducted by the West African Examinations Council (WAEC). The responses of students in these examinations are recorded as the students' ability, which places them on an ability scale in the form of grades. This makes it possible for the researcher to predict the future achievement of students; hence the relevance of this concept to the study.

Concept of Validity

Validity refers to how well an instrument measures what it intends to measure for the purposes of decision and prediction. Nworgu (2006) defined validity as the extent to which a test measures what it is supposed to measure. In other words, a test is designed for a particular aim to be achieved, and that aim is defeated if the test does not meet it. According to Eze (2003), unauthentic or inaccurate data will lead to wrong decisions, which will in turn adversely affect the learner. Gall, Gall and Borg (2007) stated that a set criterion must be on ground upon which a test is to be predicted, without which test results could be misinterpreted and have adverse effects on the examinee and on the user of the results in making a useful analysis.

Determining the validity of a test is one way of gathering information from such a test to ensure that it is accurate. Understanding the concept of test validity is important because it is the most important criterion: it determines whether the test truly measures what it is expected to measure (Walk, 2005).

To Gall, Gall and Borg (2007), there are basically four types of validity: content, concurrent, predictive and construct validity. In their view, concurrent and predictive validity can, depending on the issue at hand, be grouped together as criterion-related validity, as the two focus on the effectiveness of a test in measuring an individual's behaviour on other variables called criteria. Eze, cited in Nworgu (2003), opined that the two validities are not the same, since in predictive validity the score upon which prediction is made is obtained long before the predicted (criterion) score, whereas in concurrent validity both scores are obtained at almost the same time.

The validity of a test is the extent to which the test serves the purpose for which it was designed and administered. It is the accuracy of the inferences and interpretations we make from test scores, and it is a matter of extent or degree. That is, conclusions on the validity of a test, or the inferences drawn, are not absolute; rather, they are relative, such that a test may be valid for one purpose and not valid for another (Ejigbo, 2014).


Validity is of different types: content validity, criterion-related validity, construct validity and face validity. Content validity may be defined as the extent to which a test measures a representative sample of the subject-matter content and the behavioural change under consideration. Content validity is essentially concerned with the adequacy of the sample with respect to the content, which is of primary concern in achievement testing. This form of validity is usually built into the instrument during its process of development, making use of test blueprints (Adonu, 2014).

Criterion-related validity is another type of validity; it deals with the extent to which a measure from a test is related to an external criterion. The measure from the test or instrument is the predictor, while the external behaviour the test is predicting is known as the criterion. Criterion-related validity has sub-classifications based on the time interval between the predictor variable and the criterion variable: concurrent validity and predictive validity (Nworgu, 2015).

Construct validity is involved whenever a test is to be interpreted as a measure of some psychological construct, attribute or quality that is assumed to be represented in test performance so as to explain some psychological theory. Psychological constructs such as intelligence, anxiety and creativity are considered hypothetical constructs because they are not directly observable but are rather inferred on the basis of their observable effects on behaviour (Emaikwu, 2011). Construct validity takes two forms: convergent validity and discriminant validity. Convergent validity occurs where measures of constructs that are expected to correlate do so; this is similar to concurrent validity (which looks for correlation with other tests). Discriminant validity occurs where constructs that are expected not to relate do not, such that it is possible to discriminate between these constructs. Convergence and discrimination are often demonstrated by correlation of the measures used within constructs. Convergent validity and discriminant validity together demonstrate construct validity (William, 2006).

The ability of a test (measuring instrument) to state appropriately the likelihood of an event taking place in the future is therefore referred to as predictive validity. In this study, the state-based Transition Examination and the Mock Examination are the predictors, while the SSCE score stands as the criterion score. The subjects of focus are Mathematics, English Language, Physics, Chemistry and Biology.

Predictive validity, therefore, is a measurement of how well a test predicts future performance. It is a form of criterion validity in which how well the test works is established by measuring it against known criteria (Emaikwu, 2011). In order for a test to have predictive validity, there must be a statistically significant correlation between the test scores and the criterion being used to measure the validity.

A valid test, therefore, will measure what it is supposed to measure and give consistent results. Hence, the factors that affect the reliability of an instrument would also affect predictive validity. The factors that could affect predictive validity include time, the number of items in the test, the reliability of both tests (predictor and criterion measures), the test difficulty level at both pre-test and post-test, and the relationship between the variables being measured (Okwudili, 1996).

Concept of Achievement

The academic achievement of students can be defined as the extent to which a student or learner has attained a stated educational aim or objective. According to Epunam (1999), the academic achievement of a student is defined as the learning outcomes of the student, which may include the knowledge, skills and ideas acquired and retained through a course of study both within and outside classroom situations. Rohwer, Rohwer and B-Howe (2000) defined students' achievement as the ability to perform with adequacy and excellence as measured against specific standards of attainment. To Ogbu (2008), achievement in education refers to the success of students in learning specific curriculum content; he further stated that the paper-and-pencil test, called an achievement test, set specifically to cover the taught curriculum content, is the measuring instrument. Generally, tests are used to elicit responses from learners in order to measure the level of attainment of the set educational aims, objectives or goals. Assessment should be done periodically to ascertain whether certain educational objectives are being achieved, so as to assist operators of education in deciding whether their current approaches should be developed, improved upon or discarded (Ezeudu, 1997). There are several factors militating against the achievement of students. These may include the quality of teachers, individual differences (in terms of IQ), socio-economic factors and the learning environment, amongst others. The researcher seeks to investigate how achievement in some examinations (Transition and Mock Examinations) predicts achievement in another examination (WAEC), using the already obtained results as the instrument.

The Concept and Meaning of Correlation and Regression

Many scholars have attempted to define correlation in their own ways, but all point in the same direction of a relationship between two or more variables. According to Barry and Brooke (2004), correlation is useful when measuring the same variable on two different occasions (test-retest reliability) or when comparing one part of a measure with another part (internal reliability). Robert and John (2003) defined correlation as an association between variables; it is applied when the data are measured at the interval or ratio scale in investigating the association between them. According to Nworgu (2006), correlation is the statistical technique for establishing the extent of relationship, association or co-variation between two or more variables. The statistical indices of relationship are called coefficients of correlation, denoted by 'r', which ranges from -1 to +1 inclusive.

According to Nworgu (2006), regression analysis is a correlation-based statistical technique for making predictions. In other words, regression analysis takes off from where correlation stops, going further than showing a relationship to predicting future occurrences of such a relationship. The coefficient of determination (R²) helps to explain the percentage of change that can be attributed to the independent variable. In regression analysis, the method of least squares helps in reducing the error of prediction; the quantity minimized is Σ(Y − Ŷ)², where Ŷ is the predicted value and Y is the actual value. This study seeks to know whether or not the Transition and Mock Examinations predict students' achievement in the SSCE and, if so, to what extent they predict it. This makes the theory of regression pertinent to this research work.
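The sketch below illustrates the least-squares quantity Σ(Y − Ŷ)² and the coefficient of determination R², using hypothetical actual and predicted values rather than the study's data.

# Minimal sketch: the least-squares quantity sum((Y - Y_hat)**2) and the coefficient
# of determination R^2, computed from hypothetical actual and predicted scores.
def r_squared(actual, predicted):
    mean_y = sum(actual) / len(actual)
    ss_res = sum((y - yhat) ** 2 for y, yhat in zip(actual, predicted))   # residual sum of squares
    ss_tot = sum((y - mean_y) ** 2 for y in actual)                       # total sum of squares
    return 1 - ss_res / ss_tot      # proportion of variance explained by the prediction

actual    = [3, 5, 5, 6, 8, 9]                 # hypothetical criterion points
predicted = [3.1, 4.3, 5.4, 6.6, 7.7, 8.9]     # hypothetical values from a fitted regression
print(round(r_squared(actual, predicted), 2))  # about 0.95 for these hypothetical numbers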


SCHEMATIC DIAGRAM OF THE RELATIONSHIP BETWEEN THE VARIABLES

[Figure 1: the Transition and Mock Examinations, each covering Mathematics, English Language, Physics, Chemistry and Biology, point to the SSCE in the same five subjects.]

The predictor variables are the Transition and Mock Examinations, while the criterion variable is the SSCE. The subjects of prediction are English Language, Mathematics, Physics, Chemistry and Biology. The Transition Examination is taken in SS II, which could improve students' performance in the Mock Examination taken in the early part of SS III. In line with one of its objectives, the Transition Examination could directly predict the achievement of students in the SSCE. The Mock Examination, taken two months before the SSCE, prepares students for the main examination and could also predict students' achievement in the SSCE.

Theoretical Framework

A given study cannot stand in isolation. There must be a body of related ideas or concepts that guide the study, that is, theories within the discipline that underscore the study. According to Eze (2011), the theoretical framework of a given study is a collection of interrelated concepts that guide the researcher in determining what to measure and the statistical relationships to look for.

Classical Test Theory

Classical Test Theory represents the early twentieth-century approach to the measurement of individual differences. It was propounded by Charles Spearman in 1904 and later worked upon by Novick in 1966, while the detailed modern treatment was given by Lord and Novick in 1968. Classical Test Theory (CTT) is the first theory of measurement. The relationship between the true score and the observed score is mathematically expressed as follows:

X = T + E, where

X = observed score,

T = true score, and

E = error.

The assumptions of Classical Test Theory are as follows:

1. For each person in the population, the observed score equals the true score plus error.

2. The mean of all errors equals zero.

3. Random errors are uncorrelated with true scores.

4. The covariance of error components from two tests is zero in the population (i.e., errors from two tests are uncorrelated).

5. Errors in one test are uncorrelated with the true scores in another (i.e., measurement errors do not depend on the traits; they are random).

Classical Test Theory is a body of related psychometric theory that predicts outcomes of testing, such as the difficulty of items or the ability of the test taker. It is related to this study because the study seeks to investigate how the performance of students in the SSCE could be predicted using both the Transition and Mock Examinations. One of the tenets of Classical Test Theory is that it is sample dependent: it ignores the individual's response pattern to each of the items and measures properties of test takers. It is also related to this study because the researcher seeks to investigate the achievement of the sample of students who took the Transition, Mock and SSCE examinations.
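The short simulation below illustrates the CTT model X = T + E: observed scores are generated as true scores plus random errors, and the zero-mean-error and error-true-score independence assumptions are then checked empirically. All values are simulated for illustration and are not drawn from the study.

# Minimal sketch of the CTT model X = T + E on simulated data. The means and
# standard deviations are arbitrary illustrative choices, not study parameters.
import random

random.seed(1)
true_scores = [random.gauss(50, 10) for _ in range(5000)]        # T
errors      = [random.gauss(0, 5)  for _ in range(5000)]         # E, with mean 0
observed    = [t + e for t, e in zip(true_scores, errors)]       # X = T + E

def mean(v):
    return sum(v) / len(v)

def corr(a, b):
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

print(round(mean(errors), 2))                 # close to 0 (assumption 2)
print(round(corr(true_scores, errors), 2))    # close to 0 (assumption 3)
print(round(corr(true_scores, observed), 2))  # high (about 0.9): observed scores track true scores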

Theory of Regression

The theory of regression was propounded in 1875 by Galton. The objective of regression is to predict a criterion variable from predictor variable(s) and to develop an equation that quantifies how each predictor is related to the criterion. The equation is intended to explain the relationship between the criterion variable and the predictor variables. Hinkle, Wiersma and Jurs (1988) state that regression provides a mathematical equation that enables the prediction of scores. It involves a straight line that expresses a functional relationship between two variables. The equation is Y = bX + a, where Y = the predicted score, b = the slope of the line, a = the intercept (a constant) and X = the value of the predictor variable. The main aim of regression analysis is to minimize the amount of prediction error so as to maximize the relationship between the predictor and the criterion variable.

On the other hand, Field (2005) stated that multiple regression analysis is based on the principles and foundation of simple linear regression, except that for every extra predictor included there is a coefficient, and the criterion (outcome) variable is predicted from a combination of all the predictors multiplied by their respective coefficients, plus a residual error term. Thus the mathematical equation becomes

Y = b0 + b1X1 + b2X2 + … + bnXn + E,

where Y = the outcome variable, b0 = the regression constant, b1 = the coefficient of the first predictor (X1), b2 = the coefficient of the second predictor (X2), bn = the coefficient of the nth predictor (Xn) and E = the error term.

In this study, which seeks to investigate how the Transition and Mock Examinations could predict the achievement of students in the SSCE, the regression equation would be

Y = a + b1T + b2M,

where Y = the criterion variable (SSCE), a = the intercept, b1 and b2 = the slopes for T and M respectively, T = the first predictor variable (Transition Examination) and M = the second predictor variable (Mock Examination).
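A minimal sketch of fitting this equation by ordinary least squares is shown below; it assumes the NumPy library is available, and the Transition (T), Mock (M) and SSCE (Y) points are hypothetical illustrations rather than the data analysed in this study.

# Minimal sketch: fitting Y = a + b1*T + b2*M by ordinary least squares with NumPy.
# All score values below are hypothetical.
import numpy as np

T = np.array([4, 5, 6, 6, 7, 8, 9, 9], dtype=float)   # Transition points (hypothetical)
M = np.array([3, 5, 5, 6, 7, 7, 8, 9], dtype=float)   # Mock points (hypothetical)
Y = np.array([3, 4, 5, 6, 6, 8, 8, 9], dtype=float)   # SSCE points, the criterion (hypothetical)

X = np.column_stack([np.ones_like(T), T, M])           # design matrix: columns for the intercept, T and M
coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)         # least-squares estimates of a, b1, b2
a, b1, b2 = coeffs

Y_hat = X @ coeffs
r2 = 1 - np.sum((Y - Y_hat) ** 2) / np.sum((Y - Y.mean()) ** 2)   # coefficient of determination

print(f"Y = {a:.2f} + {b1:.2f}T + {b2:.2f}M,  R^2 = {r2:.2f}")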

Review of Related Empirical Studies

Studies on Predictive Validity

Several researchers have carried out studies on predictive validity in relation to students' achievement in external examinations. Kolawole and Ala (2013) conducted a study on the predictive validity of continuous assessment scores on students' performance in Mathematics in some selected states in South-West Nigeria. The purpose of the study was to examine predictive validity in order to determine the relationship and effects of Continuous Assessment (CA) scores on the performance of students who sat for the Senior School Certificate Examination conducted by the National Examinations Council (NECO), using the June/July 2010 Senior School Certificate Examination (SSCE) as a case study. The study was carried out in South-West Nigeria and covered Lagos, Oyo and Ekiti states. A sample of 1,807 students was randomly selected from the population of 245,935 who sat for the June/July 2010 Senior School Certificate Examination conducted by NECO. The instrument used to elicit information from NECO was tagged 'Format of Schedules'. The data were analyzed using descriptive statistics (means and standard deviations) and the inferential statistics of regression analysis. The findings showed that there was a significant, positive but low influence of the Actual Aggregate Continuous Assessment scores (AACA) on the final score. Kolawole and Ala (2013) went further to state that the Moderated Aggregate Continuous Assessment scores (MACA) were influenced by the moderation procedure; MACA was found to have a significant but negative effect on the final grade.

Achor, Kurumeh and Orokpo (2012) carried out a study on gender dimensions as predictors of students' performance in Mock-SSCE practical and theory Chemistry examinations in some secondary schools in Nigeria, using a sample of 128 students selected from 15 out of 33 schools in Ogbadibo Local Government Area of Benue State. The purpose of the study was to find out how the students' alternative-to-practical knowledge of Chemistry and their tests of theoretical knowledge of Chemistry predict their performance in Chemistry in the Mock-SSCE. The researchers developed the Students' Alternative to Test of Practical Knowledge of Chemistry (SATPKC) and the Students' Test of Theoretical Knowledge of Chemistry (STTKC) as instruments for data collection. The data were analysed using multiple regression analysis. The result showed that both male and female students' performance in the test of theoretical knowledge in Chemistry did not significantly predict their performance in the Mock-SSCE Chemistry theory examination. The study further showed that SS II female students' performance in the test of theoretical knowledge in Chemistry could not significantly predict their performance in the Mock-SSCE Chemistry practical examination.


A related study was carried out by Omirin and Ale (2008) on the predictive validity of students' English and Mathematics Mock results for the WASCE in Ekiti State, Nigeria. The study was conducted to find out whether or not the mock examination predicts students' performance in the SSCE. A simple random sampling technique was used to select three hundred and sixty students from 12 public schools in six local government areas of Ekiti State. An ex-post-facto research design was used, as the study was carried out using already existing data on the students' performance in these examinations. The instruments used were the standardized tests administered and scored by WASCE and the raw Mock scores marked by the teachers in their various schools. These results were collected in English Language and Mathematics. Correlation and regression analysis were used to analyse the data. The findings revealed that the degree of prediction of Mock over WASCE English and Mathematics was significant; that is to say, Mock English and Mathematics significantly predict good performance in the WASCE.

This study is highly related to the present one in that it was on predictive validity and used Mathematics and English Language for the prediction. It is, however, different in the sense that it was not carried out on the Transition and Mock Examinations of Kogi State, the additional three science subjects were not included, and the area of study was Ekiti State, not Kogi State.

In the study conducted by Adesoji (2008) on English Language and Mathematics Mock results as predictors of performance in SSCE Physics, the target population was all the students who sat for the 2008 May/June WAEC examination in Physics, English Language and Mathematics. One hundred and fifty (150) students were randomly sampled from five secondary schools in Shomolu and Mainland local government areas of Lagos State. The researcher collected the final SSCE results of the students in Physics for the 1998 academic year together with the Mock results in English Language and Mathematics for the same year. The data were analysed with stepwise multiple regression (backward procedure) to examine the relationship between the two independent variables (Mock English Language and Mathematics) and the dependent variable (Senior School Certificate Examination grades in Physics). The result showed that the multiple correlation of Physics with the weighted combination of English Language and Mathematics was significant. The conclusion was that the results of students in Mathematics could be used to predict their results in Physics, but those in English Language were found irrelevant in predicting the students' performance in Physics.

A related study was carried out by Innocent, Gladys and Onyiyechi (2015), which investigated students' Mathematics and English Language Mock achievement as predictors of school certificate performance in Physics. The study was carried out in Obio/Akpor Local Government Area of Rivers State. A sample of 250 students was randomly selected out of the 600 students who sat for the Senior School Certificate Examination in Physics, English Language and Mathematics in the 2010 May/June WAEC examination. The researchers used a checklist as the instrument for data collection, and the results were analyzed using the Pearson Product Moment Correlation coefficient, Analysis of Variance (ANOVA) and the multiple regression coefficient. The result showed that there is a positive relationship between the performance of students in English Language and Mathematics Mock and their performance in Physics SSCE.

That study investigated how the Mock Examination predicts students' performance in the SSCE. It is similar to the present study in its use of the Mock Examination to predict students' achievement and in its use of the correlation coefficient and Analysis of Variance (ANOVA) to analyze the data. The area of divergence is obvious, though: it used Mathematics and English Language to predict achievement in Physics, whereas the present study seeks to know how Mathematics, English Language, Physics, Chemistry and Biology Transition and Mock Examinations predict achievement in these subjects in the SSCE.

Nweze (2013) also carried out a similar study on University Matriculation Examination (UME) and post-UME entrance examination scores in Chemistry as predictors of achievement in chemistry-based courses in public universities in Enugu State. The main purpose of the study was to examine UME and post-UME scores in Chemistry as predictors of students' first-year university achievement in chemistry-related courses in different universities. The area of the study covered the public universities in Enugu State, namely the University of Nigeria, Nsukka (UNN) and the Enugu State University of Science and Technology, Enugu (ESUT). A total of 1,370 students were sampled out of the 2,396 who registered for and wrote the post-UME examination at both the University of Nigeria, Nsukka and the Enugu State university for the 2007/2008 and 2008/2009 academic sessions. A designed format was used to collect data on UME Chemistry results and first-year academic performance in chemistry-related courses. The data collected were analyzed using multiple regression analysis. The results showed that UME Chemistry scores did not predict students' achievement in chemistry-based courses in the public universities except in Industrial Chemistry. Post-UME Chemistry scores predicted first-year Chemistry scores in all the courses in the federal university, but only predicted students' first-year scores in Electrical Electronics in the state university. Combined UME/post-UME Chemistry scores predicted first-year scores in all the courses studied in the federal university, but only predicted scores in Electrical Electronics in the state university. UME scores predicted post-UME scores in Biochemistry and Industrial Chemistry in the federal university, but showed no relationship in Agricultural Economics and Electrical Electronics in the state university. UME scores only predicted post-UME scores in Biochemistry and Electrical Electronics, and did not predict them in the other courses studied.

Onyebuenyi (2009) carried out a study on ethnicity, gender and socio-economic status as moderator variables in the predictive validity of centralized mock examination. The study was carried out in the South-South geopolitical zone of Nigeria. The population comprised 877 senior secondary three (SS III) students (Igbo, Hausa and Yoruba) in the 12 Federal Government Colleges in the zone who sat for the centralized mock senior secondary certificate examination and the SSCE in 2005. A disproportionate stratified random sampling technique was used to draw 474 SSIII students from the 12 colleges, in which 100% of the Hausa and Yoruba students were used as well as 50% of the total number of Igbo students.

The Socio-Economic Status Questionnaire and the Students' Grade Chart were used to collect data from the 12 colleges. Data collected were analyzed using correlation, t-test and regression analysis. The result of the study revealed that the validity coefficient obtained was 0.56 which, according to the researcher, predicted more than 50% of SSCE. The variation could be a result of the reliability of the centralized mock senior secondary certificate examination and the SSCE, as well as the motivation of students to maintain their mock scores in the SSCE. It was also found that the centralized mock senior secondary certificate examination significantly predicted achievement in the senior secondary certificate examination, while gender did not significantly moderate the prediction of students' achievement in SSCE based on the centralized mock examination. There was a significant difference due to socio-economic status in the predictive validity of the centralized mock senior secondary certificate examination, and ethnicity did not significantly moderate the predictive validity of students' achievement based on the centralized mock senior secondary examination.


Another study was carried out by Obije (1995), seeking to establish the relationship between electronics students' entry qualifications and their performance in the West African Examinations Council technical examination. It also determined whether there is any significant difference between electronics students with a science background and those with an arts background in the West African Examinations Council examination in technology. The researcher sampled two hundred and nine (209) electronics students who passed out from Delta State Technical College in 1990, 1991 and 1992, as well as eighteen electronics teachers. Students' entry qualification grade average point (GAP) and their performance were collected and analyzed using Pearson Product Moment Correlation, mean scores and Analysis of Variance (ANOVA). The result of the analysis revealed that there was a positive significant relationship between the following:

(i) electronics students' entry qualification grade average point and their performance in the West African Examinations Council Technical examination;

(ii) electronics students' performance in the West African Examinations Council Technical examination and their last internal school examination; and

(iii) the mean performance of these categories of electronics students in the West African Examinations Council Technical examination.

(iv) It was also found that students with a science background performed better than those with an arts background.

He concluded that those students with good grade points in their school certificate had an improved performance in these examinations.

Obioma and Salau (2007) determined the extent to which scores in the West African Examinations Council Senior School Certificate Examination (WASSCE), the National Examinations Council SSCE and the National Business and Technical Examinations Board certificates (NBCE/NTCE), in conjunction with the Joint Admissions and Matriculation Board UME, predict the future academic achievement of students in university degree examinations. The research was carried out in 23 Nigerian universities in the six geopolitical zones of the country: twelve (12) federal universities, eight (8) state universities and three (3) private universities. The population comprised all students who graduated from the Nigerian universities as at December 2005. Four thousand, nine hundred and four (4,904) candidates were randomly selected from the candidates who satisfied some predetermined criteria, out of which 2,631 were males and 2,273 were females. Data were collected from the sampled universities with the help of research assistants on the basis of possession of a minimum of five credit passes at one or two sittings in the WASSCE, SSCE and NBCE/NTCE and completion of their university degree programmes as at December 2005. Data analysis was done using Pearson Product Moment Correlation coefficient and forward-inclusion stepwise multiple linear regression analysis. The result showed that there were low but positive relationships between each of the predictor variables under study and students' university academic achievement.

Obioma and Salau stated that although public examinations generally predicted students' university academic achievement poorly, when the predictors were compared individually, the WASSCE was the best single predictor of students' Cumulative Grade Point Average (CGPA).

This study is similar to the present work in that it investigated the predictive validity of examinations in determining students' performance and used correlation and regression analysis to analyze the data. It sought to investigate how the WASSCE, SSCE, NBCE and JAMB UME predict students' achievement in university education, whereas the present study is on how Transition and Centralized Mock Examinations predict the achievement of students in SSCE.


In a related study, Ale and Omodara (2015) investigated the predictive validity of the Unified Examination for academic performance in the senior secondary school certificate examination in Ekiti State. The Unified Examination is taken centrally in the third term by all SSII students for the selection of candidates to sit for the SSCE. The population of the study comprised all SSII students who participated in the Unified Examination in Ekiti State in the 2010/2011 session and in the Senior Secondary School Certificate Examination in the 2011/2012 session. Four hundred students were randomly selected from four secondary schools. The instrument for data collection was a self-made pro forma, and the data were analyzed using regression analysis and Chi-square. The result showed that the Unified Examination did not predict the academic performance of students in the senior secondary school examination. It also revealed that students in boarding schools performed better in both the Unified Examination and the SSCE. This study is closely related to the present study as it predicted SSCE performance through the Unified Examination, which is similar to the Transition Examination because both examinations are taken in the third term of SSII as promotional examinations. A pro forma was the instrument for data collection, and regression analysis was used for analyzing the data. It differs in the sense that the present study is on Transition and Mock Examinations and not the Unified Examination, apart from the fact that it was carried out in Ekiti State and not in Kogi State.

Faleye and Afolabi (2016) investigated the predictive validity of the Osun State Junior Secondary Certificate Examination. The purpose of the study was to find out whether there is a significant relationship between the overall performance of students in the JSCE and their performance in the Senior Secondary Certificate Examination (SSCE). The population of the study comprised all secondary schools in Osun State. A sample of 505 students was drawn from six purposively selected secondary schools, three of which were the top three schools of science while the other three were public secondary schools. Students whose results were available from the 1993 JSCE through SSSI, SSSII and WAEC's SSCE were considered for the study. Six subjects, namely English Language, Mathematics, Integrated Science, Yoruba Language, Social Studies and Agricultural Science, were used for the purpose of comparison, in which Biology and Chemistry were combined for Integrated Science and Geography was matched with Social Studies. The data were analyzed using correlation analysis (Pearson r). The result indicated that the JSCE is a poor predictor of students' performance in the SSCE, but JSCE English Language and Mathematics had a greater capacity to predict the performance of students in SSCE English Language and Mathematics than all the other subjects.

Edokpayi and Suleiman (2011) carried out an investigation into students' Integrated Science achievement as a predictor of later achievement in Chemistry among selected secondary schools in Zaria metropolis. The purpose of the study was to determine whether or not the JSC could effectively predict students' performance in Chemistry in the SSC Examinations among some selected schools in Zaria metropolis. The population comprised 400 secondary school students, from which two hundred students were randomly drawn from four secondary schools in Zaria metropolis. Students who had results in the Junior School Certificate Examination in Integrated Science and the Senior School Certificate Examination in Chemistry in the 2005/2006 and 2008/2009 sessions respectively were considered for the study. The data were analyzed using the Z-test statistic and Pearson's Product Moment Correlation Coefficient. The result revealed that students' academic achievement in Integrated Science cannot be used as a predictor of their later achievement in Chemistry.


Another study was carried out by Adeyemi (2008) on predicting students' performance in Senior Secondary Certificate Examinations in Ondo State, Nigeria. The purpose of the study was to determine how effective the performance of students in the JSC examination was in predicting the performance of the same students in the SSC Examinations. The population of the study comprised all the 257 secondary schools that presented candidates for the year 2000 Junior Secondary Certificate (JSC) Examination in Ondo State. It was made up of 110 urban schools and 147 rural schools, and embraced all the 13 single-sex schools and 244 mixed schools. A sample of 218 schools (94 urban and 124 rural), covering 2000 to 2003, was drawn from the population using a stratified random sampling technique. The instrument used for data collection was an inventory, and the data were analyzed using the Z-test, correlation and regression analysis. It was found that the JSC Examination is a moderate predictor of the academic performance of students at the Senior Secondary Certificate examination in Ondo State.

Summary of Literature Review

The review of the literature focused on both theoretical and empirical studies. It showed that examination is an important tool in measuring the achievement of students: students are located on the achievement scale as a result of their scores in a given examination. The literature was reviewed taking into consideration the concept of examination, the concept of validity, the concept of achievement in some school subjects, the concept and meaning of correlation and regression, classical test theory, the theory of regression and previous studies on predictive validity.

Under the empirical review, related research works, their methodologies, instrument for data collection, sampling and sampling techniques, method of data analysis and their major findings were highlighted.


The review of the relevant literature made it clear that, as in other similar works, the appropriate design is the correlational survey. The correlation coefficient, regression analysis and Analysis of Variance (ANOVA) that will be used for this study also agree with the body of knowledge reflected in the reviewed literature.

Differences exist in the sense that none of the works was carried out in Kogi State, and the Transition Examination was not used as a predictor variable in any of the studies. Mock Examination was used by some of the researchers, but not the one conducted in Kogi State, and the subjects of prediction were not the five subjects that this study used. Also, this study did not allow a long time frame that could affect its results, unlike some of the reviewed studies on predictive validity that used SSCE, JAMB and NBCE results to predict students' achievement in final-year university examinations. Such a design may not be appropriate, as the time lag is so wide that many intervening variables (e.g. age, exposure, change of environment) might have set in and contributed to the achievement of students.

CHAPTER THREE

RESEARCH METHOD

This chapter is discussed under the following sub-headings: design of the study, area of the study, population of the study, sample and sampling technique, instrument for data collection, validation of instrument, reliability of instrument, method of data collection and method of data analysis.

Design of the Study

This study adopted the correlational survey research design. According to Nworgu (2015), this type of study seeks to establish what relationship exists between two or more variables; it equally indicates the direction and magnitude of the relationship between the variables. In the present study, the researcher examined the relationship that exists between the Transition and Mock Examinations and the SSCE. The Transition and Mock Examinations are the predictor variables, while the SSCE is the criterion variable.

Area of the Study

This study was carried out in Kogi State, a state in the north-central zone of Nigeria. It is popularly called the confluence state because the confluence of Rivers Niger and Benue occurs there. Its capital is an ancient town called Lokoja. It has a total land area of 28,313.53 square kilometres and a projected population of 3.3 million people, and it lies on latitude 7.49°N and longitude 6.45°E. Kogi State is made up of three senatorial districts (Western, Central and Eastern). The major tribes in the state are the Okuns, Ebiras and Igalas. It shares common boundaries with Niger, Kwara, Nasarawa and the Federal Capital Territory to the north. To the east, the state is bounded by Benue and Enugu States, to the south by Enugu and Anambra States, and to the west by Ondo, Ekiti and Edo States. The administrative headquarters is located at Lokoja, where the two major rivers in Nigeria meet. The Western senatorial district comprises seven Local Government Areas: Kabba/Bunu, Ijumu, Yagba West, Yagba East, Mopamuro, Kogi and Lokoja. These are the Yoruba-speaking parts of the state, except Kogi and Lokoja Local Government Areas. The major occupation of the people of the Western senatorial district is education.

Population of the Study

The population of this study comprised all the forty-one thousand, three hundred and twenty-one (41,321) SSII students in Kogi State who sat for the State-Based Transition Examination (SBTE) in the 2012/2013 academic session, the Centralized Mock Examination in February 2014 and the WAEC SSCE in May/June 2014 (Kogi State Ministry of Education, 2008).

Sample and Sampling Technique

A sample of 520 SSII students who sat for the SBTE and the Centralized Mock Examination in the 2012/2013 session and had WAEC results in May/June 2014 was used. A multi-stage sampling procedure was adopted for the study. In the first stage, one senatorial district was selected out of the three senatorial districts using purposive sampling. At the second stage, two local government areas (Kabba/Bunu and Mopamuro) were selected using the purposive sampling technique. Kabba is an ancient town in which all the tribes in the state are represented and is one of the major commercial centres in the state, while Mopamuro Local Government Area equally has all the tribes in the state dwelling there because of the natural endowment of land that supports agriculture. The researcher's familiarity with the area, which eased the collection of data for the study, was also considered. At the third stage, simple random sampling was used to select six out of sixteen secondary schools from Kabba/Bunu Local Government Area and four out of ten secondary schools from Mopamuro Local Government Area. At the fourth stage, the proportionate sampling technique was used to sample 312 students from Kabba/Bunu and 208 students from Mopamuro Local Government Areas respectively, as the populations in the two local government areas are not the same. All the candidates in each of the selected schools who offered Mathematics, English Language, Physics, Chemistry and Biology were used for the study. See Appendices A and B for the list of the secondary schools in both Mopamuro and Kabba/Bunu Local Government Areas.
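To illustrate how a proportionate allocation of this kind can be computed, the minimal sketch below (Python, using hypothetical enrolment figures, since the actual counts of eligible students per local government area are not reported here) shares a total sample of 520 between the two areas in proportion to their student numbers; a 60:40 split of eligible students would reproduce the 312/208 allocation used in the study.

    # Minimal sketch of proportionate sample allocation (hypothetical figures).
    # The enrolment numbers below are illustrative assumptions, not the study's data.
    def proportionate_allocation(populations, total_sample):
        """Allocate total_sample across strata in proportion to their populations."""
        total_pop = sum(populations.values())
        # Round each stratum's share to the nearest whole student.
        return {name: round(total_sample * n / total_pop)
                for name, n in populations.items()}

    # Hypothetical eligible-student counts in the two local government areas.
    populations = {"Kabba/Bunu": 1500, "Mopamuro": 1000}
    print(proportionate_allocation(populations, 520))
    # -> {'Kabba/Bunu': 312, 'Mopamuro': 208}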

Instrument for Data Collection

The 'Students Grade Chart' (SGC) was used to collect the scores of the students in the Transition, Mock and West African Examinations Council SSCE examinations in Mathematics, English Language, Physics, Chemistry and Biology. The chart makes provision for the name of the school and the names of the students, to enable the researcher to enter the names of the students that have results in the Transition, Mock and SSCE Examinations in the subjects of interest, that is, Mathematics, English Language, Physics, Chemistry and Biology. The SGC also contains the students' registration numbers, which also indicate the centre numbers of the students. Grades in the Transition, Mock Examinations and SSCE were also obtained with it (see Appendix C).

Validation of Instrument

The instrument was face-validated by two experts in measurement and evaluation from the Department of Science Education, University of Nigeria, Nsukka.


Reliability of Instrument

The instrument was a standardized pro forma; since it was adapted, there was no need to test for its reliability.

Method of Data Collection

The researcher sought the cooperation of the principals of the selected schools to obtain the results of the students in the Transition Examination for the year 2013, the Mock Examination in 2014 and the WAEC SSCE for May/June 2014. The researcher, with the help of a research assistant who had been trained in data collection, collected these results.

Method of Data Analysis

Data collected was analyzed using the correlation coefficient and regression analysis to answer the research questions, while ANOVA was used to test the hypotheses at the .05 level of significance. Figures were assigned to each of the stanine grades A1, B2, B3, C4, C5, C6, D7, E8 and F9 in reversed order, thus 9, 8, 7, 6, 5, 4, 3, 2 and 1 respectively. A stanine ("standard nine") score is a way to scale scores on a nine-point scale; it can be used to convert any test score to a single-digit score. Like z-scores and t-scores, stanines are a way to assign a number to a member of a group relative to all members of that group. However, while z-scores and t-scores can be expressed with decimals like 1.2 or 3.25, stanines are always positive whole numbers from 1 to 9 (Chevette, 2015).

When the probability value (p-value) is less than or equal to the alpha level of 0.05, there is a significant difference and the null hypothesis is rejected; otherwise it is upheld.
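As a rough illustration of this analysis pipeline, the sketch below (Python, using pandas and statsmodels, which are not the tools reported in the study; the grade records are invented for illustration) recodes WAEC-style grades to the reversed 9-to-1 scale described above and then regresses SSCE scores on the Transition and Mock scores, reporting R-squared, the F-statistic and its p-value for comparison with the 0.05 alpha level.

    # Minimal sketch of the grade recoding and regression/ANOVA steps (illustrative data).
    import pandas as pd
    import statsmodels.api as sm

    # Reverse-coded values for the grades A1..F9 (A1 = 9, ..., F9 = 1).
    GRADE_POINTS = {"A1": 9, "B2": 8, "B3": 7, "C4": 6, "C5": 5,
                    "C6": 4, "D7": 3, "E8": 2, "F9": 1}

    # Hypothetical records: Transition, Mock and SSCE grades for a few students.
    records = pd.DataFrame({
        "trans": ["B2", "C4", "A1", "C6", "D7"],
        "mock":  ["B3", "C5", "A1", "C4", "E8"],
        "ssce":  ["B2", "C5", "A1", "C5", "D7"],
    }).apply(lambda col: col.map(GRADE_POINTS))

    X = sm.add_constant(records[["trans", "mock"]])   # predictors plus intercept
    model = sm.OLS(records["ssce"], X).fit()

    print("R-squared:", round(model.rsquared, 3))
    print("F(%d, %d) = %.2f, p = %.4f" % (model.df_model, model.df_resid,
                                          model.fvalue, model.f_pvalue))
    # Decision rule used in the study: reject the null hypothesis if p <= 0.05.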


CHAPTER FOUR

RESULTS

This chapter presents the results according to the research questions and hypotheses stated.

Research Question 1: To what extent do Transition and Mock Examinations in Mathematics

predict the achievement of students in WAEC SSCE Mathematics?

Table 1: Multiple regression analysis of the predictor variables with the criterion variable (SSCE)

Variables   Model      B        Std error   Beta     t        Sig. t   R       R2      F
Maths       Constant   -0.173   0.154                -1.120   0.263
            Trans       0.365   0.025       0.467    14.541   0.000    0.761   0.580   356.730
            Mock        0.557   0.042       0.424    13.204   0.000
English     Constant    0.875   0.252                 3.469   0.001
            Trans      -0.114   0.033      -0.114    -3.397   0.001    0.646   0.417   184.853
            Mock        0.772   0.041       0.633    18.834   0.000
Physics     Constant    0.246   0.157                 1.561   0.119
            Trans       0.240   0.029       0.300     8.335   0.000    0.616   0.379   157.791
            Mock        0.538   0.042       0.464    12.898   0.000
Chemistry   Constant    0.438   0.124                 3.537   0.000
            Trans       0.083   0.027       0.107     3.052   0.002    0.704   0.494   253.992
            Mock        0.657   0.036       0.648    18.411   0.000
Biology     Constant    0.896   0.176                 5.080   0.000
            Trans       0.119   0.029       0.153     4.018   0.000    0.611   0.374   154.330
            Mock        0.679   0.049       0.533    13.969   0.000
Combined    Constant   -2.519   0.570                -4.420   0.000
            Trans       0.209   0.031       0.208     6.775   0.000    0.848   0.719   663.045
            Mock        0.809   0.036       0.698    22.712   0.000



Table 1 shows the multiple regression analysis for the different subjects. For Mathematics, R was 0.76 with a corresponding R2 of 0.58. Thus 58% of the variance in performance in WAEC SSCE Mathematics was accounted for by the variation in performance in the Transition and Mock examinations in Mathematics.

The regression equation for Transition and Mock Mathematics scores derived from Table 1 was as follows: predicted MATSSCE = -0.17 + 0.37MATTRANS + 0.56MATMOCK.
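For illustration (hypothetical values, not drawn from the study's data), a candidate with a Transition grade of B2 (8 points) and a Mock grade of B3 (7 points) would have a predicted SSCE Mathematics score of

    -0.17 + 0.37(8) + 0.56(7) = -0.17 + 2.96 + 3.92 ≈ 6.71,

which corresponds roughly to a grade between C4 (6) and B3 (7) on the reversed 9-to-1 scale.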

Hypothesis 1: Performance in the Transition and Mock examinations does not significantly predict performance in WAEC SSCE Mathematics.

Table 2: ANOVA of the relationship between Mathematics Transition, Mock and SSCE Mathematics

Model        Sum of Squares   df    Mean Square   F        Sig.
Regression   690.00           2     345.00        356.73   .000
Residual     500.00           517   0.97
Total        1190.00          519

α = 0.05

Hypothesis 1 is tested with the ANOVA presented in Table 2. From the table, F(2, 517) = 356.73, p < .001. Since the probability value associated with the computed F-value was less than the 0.05 level of significance, the null hypothesis was rejected. This implies that the predictive validity of the Transition and Mock Examinations is statistically significant; therefore, performance in the Transition and Mock Examinations significantly predicts achievement in SSCE Mathematics.
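As a quick cross-check of this decision rule, a reported F-value can be converted to a p-value from the F distribution. A minimal sketch in Python is shown below (scipy is assumed here purely for illustration; the degrees of freedom are those in Table 2).

    # Minimal sketch: p-value for the reported F-statistic, F(2, 517) = 356.73.
    from scipy.stats import f

    f_value, df_model, df_resid = 356.73, 2, 517
    p_value = f.sf(f_value, df_model, df_resid)   # survival function = P(F > f_value)

    print(f"p = {p_value:.3g}")                   # far below .001
    print("reject H0" if p_value <= 0.05 else "fail to reject H0")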

Research Question 2: To what extent do Transition and Mock Examinations in English

Language predict the achievement of students in SSCE English Language?

The coefficient of determination of 0.42 shown in Table 1 means that 42% of students' achievement in WAEC SSCE English Language is accounted for by their performances in the Transition and Mock Examinations in English Language. This is an indication that 58% of the variation in the achievement of students in SSCE English Language is attributable to factors other than the Transition and Mock Examinations in English Language. This result shows that the Transition and Mock Examinations jointly are good predictors of students' achievement in SSCE English Language.

The regression equation for Transition and Mock English Language scores derived from Table 1 was as follows: predicted ENGSSCE = 0.88 - 0.11ENGTRANS + 0.77ENGMOCK.

Hypothesis 2: Performance in the Transition and Mock examinations does not significantly predict performance in WAEC SSCE English Language.

Table 3: ANOVA of the relationship between English Language Transition, Mock and SSCE English Language

Model        Sum of Squares   df    Mean Square   F        Sig.
Regression   535.35           2     267.67        184.85   .000
Residual     748.63           517   1.45
Total        1283.98          519

α = 0.05

Hypothesis 2 is tested with the ANOVA presented in Table 3. From it, F(2, 517) = 184.85, p < .001. Since the probability value associated with the computed F-value was less than the 0.05 level of significance, the null hypothesis was rejected. This implies that the predictive validity of the Transition and Mock Examinations is statistically significant; therefore, performance in the Transition and Mock Examinations significantly predicts achievement in SSCE English Language.

Research Question 3: To what extent do Transition and Mock Examinations in Physics predict

the achievement of students in SSCE Physics?

The result in Table 1 shows a coefficient of determination of 0.38, indicating that students' achievement in the Physics Transition and Mock Examinations accounts for 38% of students' achievement in SSCE Physics. This is an indication that 62% of the variation in the achievement of students in SSCE Physics is attributable to factors other than the Transition and Mock Examinations in Physics. This result shows that the Transition and Mock Examinations jointly are good predictors of students' achievement in SSCE Physics.

The regression equation for Transition and Mock Physics scores derived from Table 1 was as follows: predicted PHYSSCE = 0.25 + 0.24PHYTRANS + 0.54PHYMOCK.

Hypothesis 3: Performance in the Transition and Mock examinations does not significantly predict performance in WAEC SSCE Physics.

Table 4: ANOVA of the relationship between Physics Transition, Mock and SSCE Physics

Model        Sum of Squares   df    Mean Square   F        Sig.
Regression   364.33           2     182.17        157.79   .000
Residual     596.86           517   1.15
Total        961.19           519

α = 0.05

Hypothesis 3 is tested with the ANOVA presented in Table 4. From the table, F(2, 517) = 157.79, p < .001. Since the probability value associated with the computed F-value was less than the 0.05 level of significance, the null hypothesis was rejected. This implies that the predictive validity of the Transition and Mock Examinations is statistically significant; therefore, performance in the Transition and Mock Examinations significantly predicts performance in SSCE Physics.


Research Question 4: To what extent do Transition and Mock Examinations in Chemistry

predict the achievement of students in SSCE Chemistry?

Table 1 shows a coefficient of determination of approximately 0.50. This means that students' achievement in the Chemistry Transition and Mock Examinations accounted for about 50% of students' achievement in SSCE Chemistry, while the remaining 50% of the variation in students' achievement in SSCE Chemistry is attributable to factors other than the Transition and Mock Examinations in Chemistry. This result shows that, jointly, the Transition and Mock Examinations are good predictors of students' achievement in SSCE Chemistry.

The regression equation for Transition and Mock Chemistry scores derived from Table 1 was as follows: predicted CHESSCE = 0.44 + 0.08CHETRANS + 0.66CHEMOCK.

Hypothesis 4: Performance in the Transition and Mock examinations does not significantly predict performance in WAEC SSCE Chemistry.

Table 5: ANOVA of the relationship between Chemistry Transition, Mock and SSCE Chemistry

Model        Sum of Squares   df    Mean Square   F        Sig.
Regression   433.23           2     216.62        253.99   .000
Residual     440.92           517   0.85
Total        874.15           519

α = 0.05

Hypothesis 4 is tested with the ANOVA presented in Table 5, where F(2, 517) = 253.99, p < .001. Since the probability value associated with the computed F-value was less than the 0.05 level of significance, the null hypothesis was rejected. This implies that the predictive validity of the Transition and Mock Examinations is statistically significant; therefore, the Transition and Mock Examinations are significant predictors of SSCE Chemistry.


Research Question 5: To what extent do Transition and Mock Examinations in Biology predict the achievement of students in SSCE Biology?

Table 1 shows that the coefficient of determination was 0.37. This means that students' achievement in the Biology Transition and Mock Examinations accounted for 37% of students' achievement in SSCE Biology, an indication that 63% of the variation in the achievement of students in SSCE Biology is attributable to factors other than the Transition and Mock Examinations in Biology. Thus, jointly, the Transition and Mock Examinations are good predictors of students' achievement in SSCE Biology.

The regression equation for Transition and Mock Biology scores derived from Table 1 was as follows: predicted BIOSSCE = 0.90 + 0.12BIOTRANS + 0.68BIOMOCK.

Hypothesis 5: Performance in the Transition and Mock examinations does not significantly predict performance in WAEC SSCE Biology.

Table 6: ANOVA of the relationship between Biology Transition, Mock and SSCE Biology

Model        Sum of Squares   df    Mean Square   F        Sig.
Regression   460.71           2     230.36        154.33   .000
Residual     771.68           517   1.49
Total        1232.39          519

α = 0.05

Hypothesis 5 is tested with the ANOVA presented in Table 6, where F(2, 517) = 154.33, p < .001. Since the probability value associated with the computed F-value was less than the 0.05 level of significance, the null hypothesis was rejected. This implies that the predictive validity of the Transition and Mock Examinations is statistically significant; therefore, performances in the Transition and Mock Examinations are jointly significant predictors of SSCE Biology.


Research Question 6: Which of the examinations, Transition or Mock, best predicts students' achievement in SSCE?

Table 1 shows a coefficient of determination of 0.72, indicating that students' achievement in the Transition and Mock Examinations accounted for 72% of students' achievement in SSCE. Thus 28% of the variation in the achievement of students in SSCE is attributable to factors other than the Transition and Mock Examinations. This result shows that the Transition and Mock Examinations are jointly good predictors of students' achievement in SSCE.

The regression equation for the combined Transition and Mock scores derived from Table 1 was as follows: predicted SSCE = -2.52 + 0.21TRANS + 0.81MOCK. From this equation, and from the standardized beta coefficients in Table 1 (0.208 for Transition and 0.698 for Mock), it can be seen that the Mock Examination is the better predictor of overall SSCE achievement.
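Table 1 already reports the standardized beta weights used for this comparison. As a rough illustration of how such betas are obtained, the sketch below (Python with pandas and statsmodels, using a small invented data set rather than the study's data or software) z-standardizes all variables before fitting, so that the resulting slopes are directly comparable across predictors.

    # Minimal sketch: standardized (beta) weights for comparing predictors (illustrative data).
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical grade points for a few students (not the study's data).
    data = pd.DataFrame({
        "trans": [8, 6, 9, 4, 3, 7],
        "mock":  [7, 5, 9, 6, 2, 8],
        "ssce":  [8, 5, 9, 5, 3, 7],
    })

    # z-standardize every variable, then fit without an intercept:
    # the resulting slopes are the standardized beta coefficients.
    z = (data - data.mean()) / data.std()
    betas = sm.OLS(z["ssce"], z[["trans", "mock"]]).fit().params

    print(betas.sort_values(ascending=False))   # larger beta => stronger predictor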


Hypothesis 6: There is no significant difference in the predictive validity of the Transition and Mock Examinations.

Table 7: ANOVA of the relationship between TRANS, MOCK and SSCE

Model        Sum of Squares   df    Mean Square   F        Sig.
Regression   11799.96         2     5899.98       663.05   .000
Residual     4600.42          517   8.90
Total        16400.38         519

α = 0.05

Hypothesis 6 is tested with the ANOVA presented in Table 7. From it, F(2, 517) = 663.05, p < .001. Since the probability value associated with the computed F-value was less than the 0.05 level of significance, the null hypothesis was rejected. This implies that the predictive validity of the Transition and Mock Examinations is statistically significant; therefore, performances in the Transition and Mock Examinations are significant predictors of SSCE.

Summary of the Results

The outcome of the data analysis provided the following results:

1) Transition and Mock Examinations in Mathematics are good predictors of SSCE

Mathematics. Transition and Mock scores accounted for 58% of the variance of students’

achievement in SSCE Mathematics.

2) There was a good prediction of SSCE English Language by Transition and Mock

Examinations. Transition and Mock Examinations in English Language predicted 42% of

the total variance of the students’ achievement in SSCE English Language.

3) Transition and Mock Examinations in Physics are good predictors of SSCE Physics.

Transition and Mock scores accounted for 38% of the variance of students’ achievement

in SSCE Physics.


4) The prediction of SSCE Chemistry by the Transition and Mock Examinations in Chemistry was good, as Transition and Mock Chemistry scores accounted for 50% of the variance of students' achievement in SSCE Chemistry.

5) Transition and Mock Examinations in Biology are good predictors of SSCE Biology. Transition and Mock scores accounted for 37% of the variance of students' achievement in SSCE Biology.

6) The combination of Transition and Mock Examinations scores jointly accounted for 72%

of the variance of students’ achievement in SSCE.


CHAPTER FIVE

DISCUSSION OF RESULTS, RECOMMENDATIONS AND SUMMARY.

This chapter presents the discussion of the results within the framework of this research, draws inferences and makes conclusions based on the results. It explores the educational implications of the results, identifies the limitations of the study, makes recommendations, offers suggestions for further research and ends with a summary of the study.

Discussion of the Results

The findings of the study are discussed under the following sub-headings;

i) The predictive validity of Transition and Mock Examinations in Mathematics on

students’ achievement in SSCE Mathematics.

ii) The predictive validity of Transition and Mock Examinations in English Language on

students’ achievement in SSCE English Language.

iii) The predictive validity of Transition and Mock Examinations in Physics on students’

achievement in SSCE Physics.

iv) The predictive validity of Transition and Mock Examinations in Chemistry on students’

achievement in SSCE Chemistry.

v) The predictive validity of Transition and Mock Examinations in Biology on students’

achievement in SSCE Biology.

vi) The predictive validity of the combination of Transition and Mock Examination scores on the achievement of students in SSCE.



The predictive validity of Transition and Mock Examinations in Mathematics on students’ achievement in SSCE Mathematics.

Transition and Mock scores explained 58% of the variance of SSCE Mathematics. This finding tends to support the report of Kolawole and Ala (2013), whose study of the predictive validity of continuous assessment scores on students' performance in Mathematics observed that the Aggregate Continuous Assessment scores (AACA) had an influence of 13.1% on the final scores.

It is also in line with the study conducted by Omirin and Ale (2008) on the predictive validity of English and Mathematics, which indicated that Mock result in Mathematics has 23% predictive value of WASSCE Mathematics. However, there is a deviation from the work of

Adeyemi (2008) which revealed that JSC Mathematics could not significantly predict SSCE

Mathematics. Also Ale and Omodara (2015) on the predictive validity of unified Examination for Academic performance in senior secondary school certificate examination in Ekiti State showed that there was no significant relationship (r=0.04, P>0.05) between unified examination and SSCE Mathematics.

The predictive validity of Transition and Mock Examinations in English Language on students’ achievement in SSCE English Language.

From the result, 42% of the total variance of SSCE English Language was accounted for by achievement in the Transition and Mock Examination scores, and F(2, 517) = 184.85, p < .001 indicated that there was a significant relationship between the Transition and Mock Examinations and SSCE English Language. This means that Transition and Mock English Language positively predict SSCE English Language. This finding tends to support the study conducted by Omirin and Ale (2008), who investigated how Mathematics and English Language Mock scores predict students' achievement in the WASCE in Ekiti State.


The result showed that Mock English Language predicted the achievement of students in WASCE English Language; the prediction of WASCE English Language by the Mock was 25%. Also, Adeyemi (2008), in a study on predicting students' performance in senior secondary certificate examinations in Ondo State, revealed that JSC English Language predicted SSCE English Language. It is also in line with the research conducted by Faleye and Afolabi (2016), who revealed that the JSCE significantly predicted SSCE English Language: 59% of the students who obtained 'A's in the JSCE also obtained 'A's in the SSCE, 70.8% of students who obtained 'C's in the JSCE obtained 'C's or better in the SSCE, and only 59.4% of students who obtained an 'F' grade in the JSCE did likewise in the corresponding or equivalent SSCE subjects.

The predictive validity of Transition and Mock Examinations in Physics on students’ achievement in SSCE Physics.

Transition and Mock scores explained 38% of the variance of SSCE Physics. This finding is not in agreement with the studies conducted by Innocent, Gladys and Onyiyechi (2015), Adesoji (2008) and Adeyemi (2008). For instance, Innocent, Gladys and Onyiyechi (2015) investigated students' Mathematics and English Language Mock achievement as predictors of school certificate performance in Physics and found that there was a significant relationship between the performance of students in SSCE Physics and their performance in the Mock Examination in Mathematics.

Adesoji stated that the two independent variables accounted for 83% of the variance of the dependent variable (R2 = 0.8301), which was statistically significant. Furthermore, the Mathematics result could be used to predict students' performance in Physics, but the English Language result could not. The present study does not combine the achievements of students in Mathematics and English Language to predict students' achievement in SSCE. The study conducted by Adeyemi (2008) revealed that JSCE Integrated Science could not significantly predict students' performance in SSCE Physics.

The predictive validity of Transition and Mock Examinations in Chemistry on students’ achievement in SSCE Chemistry.

Transition and Mock Chemistry accounted for 50% of total variation in SSCE Chemistry.

This study supports the finding of Adeyemi (2008) that Integrated Science significantly predicted

SSCE Chemistry. There is a deviation from the study conducted by Achor, Kurumeh and Orokpo

(2012) on gender dimension as a predictor of students' performance in Mock-SSCE practical and theory Chemistry examinations in some secondary schools in Nigeria. The study revealed that Mock-SSCE practical and theory Chemistry did not significantly predict the performances of male and female students. Also, in the work of Edokpayi and Suleiman (2011), there existed only a poor relationship between students' Integrated Science achievement in the JSCE and their later achievement in SSCE Chemistry.

The predictive validity of Transition and Mock Examinations in Biology on students’ achievement in SSCE Biology.

Transition and Mock examinations accounted for 37% of the variation in SSCE Biology.

The finding is in support of the study conducted by Adeyemi (2008) on predicting students’ performance in senior secondary certificate Examinations in Ondo State, Nigeria. The conclusion was that Integrated Science predicted SSCE Biology. It is also in agreement with the research work conducted by Ale and Omodara (2015) on the predictive validity of unified Examination for Academic performance in senior secondary school certificate examination in Ekiti State which revealed that the correlation between SSII Unified Examination and SSCE was significant for Biology.


The predictive validity of the combination of Transition and Mock Examination scores on the achievement of students in SSCE.

The results equally revealed that the Transition and Mock Examinations accounted for 72% of the total variation in SSCE. This finding is in agreement with the study carried out by Adeyemi (2008), which revealed that there is a significant relationship between performances in the JSC and SSC Examinations. It is not in agreement with Faleye and Afolabi (2016), who revealed that the Osun State JSCE is a poor predictor of students' achievement in SSCE.

Conclusion

This study sought to investigate the predictive validity of the State-Based Transition and Mock Examination scores in Kogi State. The focus was to determine the extent to which

Transition and Mock Examinations predict the students’ achievement in SSCE in science subjects including English Language and Mathematics and to determine which of the two examinations best predicts students’ achievement in SSCE.

Based on the findings, it could be concluded that students' performance in the Mathematics Transition and Mock Examinations predicted performance in SSCE Mathematics. The result also revealed that the English Language Transition and Mock Examinations significantly predict students' achievement in SSCE English Language. Similarly, the findings of this study showed that the Physics Transition and Mock Examinations significantly predict the achievement of students in SSCE Physics.

Furthermore, the results of this study led to the conclusion that a significant relationship exists among the Chemistry Transition and Mock Examinations and the achievement of students in SSCE Chemistry.

Also, it was revealed that the Biology Transition and Mock Examinations significantly predicted students' achievement in SSCE Biology. Additionally, it was found that both the Transition and Mock Examinations predict students' achievement in SSCE, but that the predictive value of the Mock is higher than that of the Transition Examination. In essence, the results revealed that the combination of the Transition and Mock Examinations is a better option in preparing students for external examinations.

Educational Implications of the study

The results of this study have implications for the Kogi State Government, school administrators, teachers, parents/guardians, students and the general society.

The data analysis revealed the following facts about the Transition and Mock Examinations in Kogi State: performance in all five subjects used for this study significantly predicted the achievement of students in the SSCE. Therefore, the examinations can be improved upon if they continue to be used in preparing students for this external examination.

Furthermore, the results of the data analysis from this study have provided evidence that the combination of the Transition and Mock Examinations significantly predicted students' achievement in SSCE. They also revealed that the Transition Examination correlates highly with the Mock Examination, which in turn predicted students' achievement in SSCE more strongly than the Transition Examination. Because the two examinations complement each other, the government should continue to bear the burden of conducting them as a preparative ground for the students.

Additionally, the findings of the research have implications for quality assurance and quality control in the state educational system. Since quality assurance in the state educational system starts with the procedures, rules and policies that govern the selection of students, the Transition Examination seems to be a practical attempt to ensure that students are prepared and selected for the external examinations. The fact that the Transition and Mock Examinations are significant predictors suggests that the effort of the government is not in vain; therefore, the examinations could be improved upon and used to prepare students for SSCE.

For parents/guardians, teachers and students, the result of this study suggests that the period of mass promotion is over and that students and parents should accept the outcome if they cannot meet the required standard set by the state government, which is a pass at the five-credit level, including English Language and Mathematics, before a student is allowed to move to the class in which the external examination is taken.

Limitations of the Study

The following limitations, which may have influenced the outcome of the study, were observed:

1) Data included in the study were obtained only from accessible students' records in the two local government areas, and the sample was drawn from only ten secondary schools and five subjects.

2) It was observed that there were many bureaucratic bottlenecks which hindered access to the necessary documents in these secondary schools; examination and records officers of the various schools were quite reluctant to release them.

Recommendations

Based on the conclusion and educational implications of the study, the following

recommendations are made:

1) The government should take appropriate measures to ensure that the Transition and Mock Examination scores obtained by students are congruent with their scores in the SSCE.

2) The Kogi State Ministry of Education should ensure that the selection of students for the SSCE is maintained, that the Transition Examination currently in use is sustained, and that the repackaged Mock Examination is improved upon for better performance in these external examinations.

3) In the face of incessant mass failure in these external examinations, the examination bodies should liaise with the state governments to see how students can be better prepared for these examinations.

4) Parents, guardians and students should bear the extra cost and expenses incurred in participating in the Transition and Mock Examinations, since they provide a viable means of obtaining evidence about the credibility of candidates' performance in the external examinations.

Suggestion for Further Studies

Based on this study, the researcher wishes to make the following suggestions:

1) The researcher used already existing results for the study. Other researchers could administer the question papers of the Transition, Mock and SSCE examinations to elicit responses from students in Kogi State.

2) Only ten secondary schools from two local government areas in the Western senatorial district of Kogi State were used for the study; a broader scope could be adopted by other researchers in Kogi State. The study could also be replicated in other states that conduct both Transition and Mock Examinations to prepare students for external examinations.

3) Five subjects were used for the study, and only one year was covered. More subjects could be used over more than one year, thereby giving room for comparison.

4) The only criterion variable used in this study was SSCE. Other researchers could attempt

using other external examinations like the ones conducted by NECO and NABTEB.


Comparison can be made between the predictive validity of Transition and Mock

Examinations on two external examinations.

5) Both the Transition and Mock Examinations accounted for only a proportion of the variation in SSCE, which implies that there are other factors that were not considered. Other researchers could investigate how moderator variables, combined with the Transition and Mock Examinations, influence students' achievement in SSCE.

Summary of the Study

The general purpose of the study was to determine the predictive validity of the Transition and Mock Examinations in Kogi State and to determine which of the examinations better predicts the achievement of students in SSCE. Over the years, mock examinations have been used to prepare students for external examinations. However, in 2008 the Kogi State Government introduced the Transition Examination and a repackaged Mock Examination, implemented in 2012, as a result of the poor performance of students in the external examinations.

The study was guided by the following research questions:

i) To what extent do Transition and Mock Examinations in Mathematics predict the achievement of students in SSCE Mathematics?

ii) To what extent do Transition and Mock Examinations in English Language predict the achievement of students in SSCE English Language?

iii) To what extent do Transition and Mock Examinations in Physics predict the achievement

of students in SSCE Physics?

iv) To what extent do Transition and Mock Examinations in Chemistry predict the

achievement of students in SSCE Chemistry?


v) To what extent do Transition and Mock Examinations in Biology predict the achievement

of students in SSCE Biology?

vi) Which of the examinations, Transition or Mock, best predicts students' achievement in SSCE?

The literature review covered the concept of examination, the concept of validity, the concept of achievement in some school subjects, and the concept and meaning of correlation and regression. The theoretical framework discussed classical test theory and the theory of regression, and empirically related studies on predictive validity were also reviewed.

This study adopted the correlational survey design and was carried out in Kogi State. The sample consisted of 520 students drawn through a multi-stage sampling technique. Data were obtained from the sampled schools with the help of a research assistant, using a pro forma called the Students' Grade Chart (SGC). Data were analyzed using correlation and multiple regression analysis.

The findings of the study, derived from the data analysis, are as follows:

1) Transition and Mock Examinations in Mathematics are good predictors of SSCE

Mathematics.

2) There was a good prediction of SSCE English Language by Transition and Mock

Examinations. Transition and Mock Examinations in English Language predicted 42% of the

total variance of the students’ achievement in SSCE English Language.

3) Transition and Mock Examinations in Physics are good predictors of SSCE Physics. Transition and Mock scores accounted for 38% of the variance of students' achievement in SSCE Physics.


4) The prediction of SSCE Chemistry by the Transition and Mock Examinations in Chemistry was good: Transition and Mock Chemistry scores accounted for 50% of the variance of students' achievement in SSCE Chemistry.

5) Transition and Mock Examinations in Biology are good predictors of SSCE Biology. Transition and Mock scores accounted for 37% of the variance of students' achievement in SSCE Biology.

6) The combination of Transition and Mock Examinations scores jointly accounted for 72% of

the variance of students’ achievement in SSCE.

The above findings were discussed and conclusions drawn. Some of the findings were found to be in agreement with previous research findings, while others were not. The findings have implications for the government, school administrators, parents/guardians, teachers and students. Recommendations were made based on the findings and implications of the study.

Among other things, it was recommended that the Kogi State Ministry of Education should ensure that the selection of students for the SSCE is maintained, that the Transition Examination currently in use is sustained, and that the repackaged Mock Examination is improved upon for better performance in these external examinations. The limitations encountered in the study were discussed and suggestions for further research were made.


REFERENCES

Abdullahi, O. E. (2009). Examination malpractices at the post-primary school level. African Journal of Educational Studies, 6(1), 34-37.

Achor, E. E., Kurumeh, S. M., & Orokpo, C. A. (2012). Gender dimension in predictors of students' performance in mock-SSCE practical and theory chemistry examinations in some secondary schools in Nigeria. Scientific and Academic journal, 2(2), 16-22.

Adaraniwon, A. O., & Adetunji, A. A. (2011). The effect of weak foundation in the learning of mathematics. A paper presented at the 48th National Annual Conference of the Mathematical Association of Nigeria(MAN) held at the Federal Polytechnic, Ado-Ekiti. Ekiti State.

Adesoji, F. A. (2008). English language and mathematics mock results as predictors of performance in SSCE physics. Journal of Social Science, 17(2), 159-161.

Adeyegbe, S. O. (2004). Research into STM curriculum and school examinations in Nigeria. The state of the Art, proceedings of the STAN Annual conference.

Adeyemi, T. O. (2008). Predicting Students' Performance in Senior Secondary Certificate Examinations from Performance in Junior Secondary Certificate Examinations in Ondo State, Nigeria. Middle-East Journal of Scientific Research, 3(2), 73-81.

Adonu, I. I. (2014). Psychometric Analysis of WAEC and NECO practical physics test using partial credit model. (Unpublished Ph.D Thesis), UNN.

Agah, J. J. (2013). Relative Efficiency of Test Scores Equating Methods in the Comparison of Students Continuous Assessment Measure. (Unpublished Ph.D Thesis), UNN.

Agbi, T. O. (2006). The effect of different instructional methods on the achievement and interest of secondary school chemistry students. (Unpublished Ph.D thesis), University of Nigeria, Nsukka.

Agomuoh, P. C. (2010). Effect of prior knowledge, exploration, discussion, dissatisfaction with prior knowledge and application (PEDDA) and the learning cycle constructivist instructional models on students' conceptual change and retention in Physics. (Unpublished Ph.D thesis), University of Nigeria, Nsukka.

Ale, S. O., & Adetula, L. O. (2010). The national mathematical centre and the mathematics improvement project in nation building. Journal of Mathematical Science Education, 1(1), 1-19.


Ale, V. M., & Omodara, M. F. (2015). Predictive validity of Unified Examination for Secondary School Certificate Examination in Ekiti State. Palgo Journal of Education Research, 3(1), 140-145.

Aniaku, O. L. (2012). Effects of guided and unguided inquiry teaching methods on secondary school students' achievement and interest in Biology in Enugu State. (Unpubllished M.Ed project), UNN.

Asikhia, O. A. (2010). Students' and teachers' perception of the causes of poor academic performance in Ogun State secondary schools (Nigeria): Implication for counselling for national development. European Journal of Social Sciences, 13(2), 229.

Awodun, A. O., Olusola, O. O., & Oyeniyi, A. D. (2013). Impact of continuous assessment, mock results and gender on Physics students' achievement in senior school certificate examination in Ekiti State, Nigeria. IJERT, 2(5).

Barry, H. C., & Brooke, R. L. (2004). Essentials of statistics for the social and behavioral sciences. Canada: John Wiley & Sons Inc.

Boeree, G. (2004). Evolution of english. Shippensburg: University of Shippensburg.

Chevette, A. (2015). Basic Statistics in Education. Retrieved from http://www.statisticshowto.com/stanine/.

Edokpayi, J. N., & Suleiman, M. A. (2011). Students' Integrated Science achievement as predictors of later achievement in Chemistry: A case study among selected secondary schools in Zaria metropolis. Archives of Applied Science Research, 3(4), 527-535.

Ejigbo, M. A. (2014). Comparison of Senior School Certificate Examinations by West African Examination and National Examination Councils and performance of students in Kogi State 2008-2010. (Unpublished Ph.D Thesis), UNN.

Emaikwu, S. O. (2011). Fundamentals of research methods and statistics. Makurdi: Selfers Academic Press Ltd.

Emmanuel, E. A., & Joel, O. U. (2014). An Examination of the facilitative effect of Computer Assisted Instruction (CAI) in students' achievement in chemical reaction and equilibrium. Journal of Education, 4(1), 7-11.

Epunam, A. D. (1999). Influence of school environmental variables on academic performance as perceived by students. (Unpublished M.Ed Thesis), University of Nigeria, Nsukka.

Eze, D. (2003). Validity and reliability of test. In B. G. Nworgu, Educational measurement and evaluation: theory and practice. Nsukka: University Press Publishers.


Eze, D. N. (2011). Writing research proposal and report without tears. Nsukka: Ephrata Publishers.

Ezeudu, S. A. (1997). Educational measurement and evaluation for colleges and universities. Onitsha: Cape Publishers Int'l Ltd.

Ezeudu, S. A. (2005). Research methodology. Unpublished seminar paper presented at the 1st research seminar, Kogi State College of Education, Ankpa.

Fakeye, D. O. (2006). ICT-Assisted instruction and students' vocabulary achievement in selected secondary schools in Ibadan. Journal of Humanities in Education, 1(1), 14-17.

Faleye, B. A., & Afolabi, E. R. (2016). The predictive validity of Osun State Junior Secondary Certificate Examination. Electronic Journal of Research in Educational Psychology, No 3-5(1), 131-144.

Fan, X. (1998). Item Response Theory and Classical Test Theory: An empirical comparison of their item/person characteristics. Educational and Psychological Measurement, 3, 357- 382.

Field, A. (2005). Discovering Statistics using SPSS (3rd Ed). London: Sage Publication.

Frank, B. B. (2001). The basics of item response theory. USA: Clearinghouse on Assessment and Evaluation.

FRN. (2013). National Policy on Education. Lagos: NERD Press.

Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research:an introduction. USA: Pearson Education Inc.

Garson, G. D. (2008). Validity. Retrieved from http://www2.class.ncsu.edu/garson/validity-html.

Ghiselli, E. E. (1981). Measurement Theory for the Behavioral sciences. San Francisco: W.H Freeman.

Hambleton, R. K., & Jones, R. W. (1993). Comparison of Classical Test Theory and Item Response Theory and their Application to Test Development. Educational measurement, issues and practice, 12(3), 38-47.

Harbor-Peters, A. V. (1999). Noteworthy prints on measurement and evaluation. Enugu: Snaap Press.

Hinkle, D. E., Wiersa, W., & Jurs, S. G. (1988). Applied Statistics for the Behavioural Sciences (2nd Ed). Boston: Houghton, Mifflin.

Hogan, T. P., & Angello, J. (2004). An empirical study of reporting practices concerning measurement validity. Educational and Psychological Measurement, 64(1), 802-812.

Innocent, F. O., Gladys, C. O., & Onyiyechi, O. O. (2015). Students' Mathematics and English Language Mock Examination as predictors to school certificate performance in Physics. British Journal of Education, 3(8), 47-54.

Jeffrey, M. S. (2001). Galton, Pearson and the peas: A brief history of linear regression for statistics instructors. Journal of Statistics Education, 9(3).

Kogi State Ministry of Education. (2008).

Kolawole, C. O., & Dele, A. (2002). An examination of the national policy of language education in Nigeria and its implications for the teaching and learning of the English Language. Ibadan Journal of Educational Studies, 2(1), 12-20.

Kolawole, E. B., & Ala, E. A. (2013). Predictive validity of continuous assessment scores on students' performance in mathematics in some selected states in the south-west Nigeria. Journal of Educational Research and Review, 1(4), 41-48.

Kolawole, E. B., & Udoh, D. O. (2012). Head circumference as a predictor of aptitude performance in mathematics. (Unpublished Thesis), Ekiti State University, Ado Ekiti.

Lassa, P. N., & Paling, D. (2003). Teaching mathematics in Nigerian primary schools. Ibadan: University Press Ltd.

Leonard, R. (2004). Mock examinations assessment in education. Journal of Education, 5(2), 55-56.

Macmillan, M. J. (2012). School location versus academic achievement in Physics: Does Computer-Assisted Instruction (CAI) have any effect? Journal of Educational and Social Research, 2(8), 64-67.

Maduabum, M. A. (1998). Motivational problems that hinder teaching accountability. In M. E. Chidolue & C. C. Anidu (Eds.), Effective teaching: The Nigeria perspective. Awka: Nnamdi Azikiwe University Press.

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessment. Journal of Educational Research, 23(2), 18-23.

Moss, P. A. (1994). Can there be validity without reliability? Journal of Educational Research, 23(2), 5-12.

Nweze, B. N. (2013). University matriculation entrance examination scores in chemistry as predictors of achievement in chemistry-based courses in public universities in Enugu State. (Unpublished Ph.D thesis), University of Nigeria, Nsukka.

Nworgu, B. G. (2006). Educational research: basic issues & methodology. Nsukka: University Trust Publishers.

Nworgu, B. G. (2015). Educational research: basic issues & methodology. Nsukka: University Trust Publishers.

Obije, I. B. (1995). Entry qualification as correlate of students' performance in electronics in Delta State technical colleges. (Unpublished M.Ed thesis), University of Nigeria, Nsukka.

Obinne, A. D. (2011). A psychometric analysis of two major examinations in Nigeria: Standard error of measurement. International Journal of Educational Science, 3(2), 137-144.

Obioma, G., & Salau, M. (2007). The predictive validity of public examinations: A case study of Nigeria. 33rd Annual Conference of the International Association for Educational Assessment (IAEA) (pp. 16-21). Baku.

Odeleye, O. A., Olusola, J. S., & Awodun, A. O. (2010). Enhancing the integrity of educational evaluation: Curbing the menace of test and examination malpractices in Nigeria. Journal of Management Skills and Techniques, 2(3), 102-109.

Ogbu, J. E. (2008). Effects of interaction patterns on students' achievement and interest in basic electricity. (Unpublished Ph.D thesis), Faculty of Education, University of Nigeria, Nsukka.

Okeagu, R. U. (2013). Effects of problem-solving teaching method on students' achievement in Mathematics. (Unpublished M.Ed project), University of Nigeria, Nsukka.

Okoli, J. N. (2006). Effect of investigative laboratory approach and expository methods on acquisition of science process skills by Biology students of different levels of scientific literacy. Journal of the Science Teachers' Association of Nigeria, 41(1&2), 79-88.

Okoro, A. U. (2012). Effect of interaction patterns on achievement and interest in Biology among secondary school students in Enugu State, Nigeria. (Unpublished M.Ed project), University of Nigeria, Nsukka.

Okwudili, M. I. (1996). Predictive validity of university matriculation examination in Nigeria: Implications for curriculum development. (Unpublished Ph.D thesis), Enugu State University of Science and Technology.

Olagbaju, O. O., & Akinsowon, F. I. (2014). The use of Nigerian languages in federal education: Challenges and solutions. Journal of Education and Practice, 5(9), 123-127.

Olusola, A. (2006). Advocates of examination malpractice. Retrieved from http://ezinearticles.com/?Advocates-of-Examination-Malpractice&id=292923.

Omirin, M. S., & Ale, V. M. (2008). Predictive validity of English and Mathematics mock examination results of senior secondary school students' performance in WASCE in Ekiti State, Nigeria. Pakistan Journal of Social Sciences, 5(2), 139-141.

Onyebuenyi, E. N. (2009). Ethnicity, gender and socioeconomic status as moderator variables in the predictive validity of centralized mock examination. (Unpublished Ph.D thesis), University of Nigeria, Nsukka.

Orji, C. N. (2013). Comparative study of students' achievement in English Language in WAEC and NECO from 2007-2011 in Nsukka Education Zone of Enugu State. (Unpublished M.Ed project), University of Nigeria, Nsukka.

Parkes, J. (2000). Reliability as argument. Journal of Educational Measurement and Evaluation, 26(4), 2-10.

Paul, D. (1995). Controlling human heredity: 1865 to the present. Atlantic Highlands, NJ: Humanities Press.

Pearson, K. (1930). The life, letters and labors of Francis Galton. London: Cambridge University Press.

Raina, A. O. (2011). Gender sensitivity of recommended Chemistry textbooks for senior secondary schools in Nigeria. (Unpublished Ph.D thesis), University of Nigeria, Nsukka.

Richards, R. J. (2002). The romantic conception of life: Science and philosophy in the age of Goethe. Chicago: University of Chicago Press.

Robert, L. M., & John, D. B. (2003). The A-Z of social research. London: SAGE Publications Ltd.

Rohwer, W. D., Rohwer, C. P., & B-Howe, J. R. (2000). Educational psychology: Training for student diversity. New York: Holt, Rinehart and Winston.

Schumacker, R. E. (2009). Classical item analysis. International Journal of Educational and Psychological Assessment, 1(1).

Strodach, G. K. (2012). The art of happiness. New York: Penguin Classics.

Ugwuadu, O. R. (2011). Effects of discourse patterns on students' achievement and interest in Biology. (Unpublished Ph.D thesis), University of Nigeria, Nsukka.

Ukeje, B. O. (1997). The challenges of mathematics in Nigeria's economic goals of vision 2010: Implications for secondary school mathematics. A lead paper presented at the 34th Annual National Conference of the Mathematical Association of Nigeria.

Umaru, F. C. (2005). Issues in applied English language. Nsukka: Chuka Educational Publishers.

WAEC. (2015). Retrieved from http://www.waecheadquartersgh.org/index.php?option=com_content&view=article&id=4 6&Itemid=54.

Walk, R. A. (2005). Evaluating the predictive validity of the Speed DIAL version of the DIAL-3 (Developmental Indicators for the Assessment of Learning). (Unpublished doctoral dissertation), East Tennessee State University, U.S. Retrieved from http://www.thesesaddissertation.com.

Wilder, R. L. (2001). Evolution of mathematical concepts. London: Transworld Publishers Ltd.

William, M. K. (2006). Measurement validity type. Retrieved from www.socialresearchmethods.net/kb/measval.php.

Zakka, J. (2014). Innovative strategies for curbing examination malpractices in public examinations in Nigeria. (Unpublished M.Ed project), University of Nigeria, Nsukka.

APPENDIX A

NAMES OF THE PUBLIC SECONDARY SCHOOLS IN MOPAMURO L.G.A

i. Amuro Community Secondary School.
ii. ECWA Secondary School, Mopa.
iii. Government Day Secondary School, Takete-Idde.
iv. Orokere Comprehensive High School.
v. Local Government Secondary School, Aiyedayo Amuro.
vi. Baptist High School, Mopa.
vii. Cruise Memorial College, Mopa.
viii. Government Technical College, Mopa.
ix. Government Science School.
x. Local Government Secondary School, Okeagi.

APPENDIX B

NAMES OF THE PUBLIC SECONDARY SCHOOLS IN KABBA/BUNU LGA

i. Comprehensive Secondary School, Kabba
ii. St. Augustine College, Kabba
iii. St. Monica's College, Kabba
iv. St. Barnabas College, Kabba
v. Community Secondary School, Outu Egunbe
vi. Community Secondary School, Kakun
vii. Federal Government College, Kabba
viii. Science Secondary School, Okedayo
ix. Comprehensive Secondary School, Ila
x. Community Secondary School, Iluke
xi. Kiri High School, Akutupa
xii. Community High School, Okebukun
xiii. Local Government Secondary School, Oke-Ofin
xiv. Community Secondary School, Ayede
xv. Community Secondary School, Odo Ape
xvi. Community Secondary School, Ole

APPENDIX C

STUDENTS’ GRADE CHART (SGC)

Name of School: ……………………………………………………………………………

S/N | Name of Student | Reg No | TE Score/Grade: Math, Eng, Phy, Chem, Bio | Mock Score/Grade: Math, Eng, Phy, Chem, Bio | SSCE Score/Grade: Math, Eng, Phy, Chem, Bio
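Note: The sketch below is purely illustrative and is not one of the study's instruments or analysis procedures. Assuming the marks recorded on the Students' Grade Chart have been transcribed into numeric scores (the figures shown here are hypothetical, not data from this study), it indicates, in Python with the NumPy library, how an ordinary least-squares regression predicting SSCE scores from transition examination (TE) and mock scores could be computed for one subject.

# Illustrative only: hypothetical scores, not data from this study.
# Each row holds one student's (TE, Mock, SSCE) marks in a single subject.
import numpy as np

records = np.array([
    [52.0, 48.0, 55.0],
    [61.0, 58.0, 63.0],
    [45.0, 40.0, 47.0],
    [70.0, 66.0, 72.0],
    [58.0, 55.0, 60.0],
])

# Design matrix: intercept column, TE column, Mock column.
X = np.column_stack([np.ones(len(records)), records[:, 0], records[:, 1]])
y = records[:, 2]  # SSCE scores

# Ordinary least-squares estimates of the regression weights.
coefficients, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, b_te, b_mock = coefficients
print(f"Fitted model: SSCE = {intercept:.2f} + {b_te:.2f}*TE + {b_mock:.2f}*Mock")

# R-squared: proportion of variance in SSCE explained by TE and Mock together.
predicted = X @ coefficients
ss_res = float(np.sum((y - predicted) ** 2))
ss_tot = float(np.sum((y - y.mean()) ** 2))
print(f"R-squared = {1 - ss_res / ss_tot:.3f}")

On a full data set transcribed from the chart, the sizes of the two weights and the R-squared value would indicate how strongly the transition and mock results, singly and jointly, predict SSCE achievement.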
