The use of the General Achievement Test in student selection at Monash University: A preliminary evaluation
Dr Clare Hourigan, University Planning & Statistics, Office of Planning & Quality, Monash University

Introduction

In 2008, Monash University commenced a three-year pilot program using General Achievement Test (GAT) scores as supplementary information to the ENTER (Equivalent National Tertiary Entrance Rank), which is used to select students who apply to the University on the basis of their Victorian Certificate of Education (VCE). The University wanted to improve its ability to select students who would succeed at university and believed that an aptitude test could provide additional information about a student’s potential.

The catalyst for this was a 2004 UK government-instigated review of university admissions and its subsequent recommendation that aptitude tests be used for admission to UK universities. This recommendation was partly based on the findings of a pilot of the Scholastic Assessment Test (SAT), which showed that some students scored well enough on the SAT to be considered for selection at top ranked universities even though they had not obtained the necessary secondary schooling results (McDonald, Newton & Whetton, 2001).

The GAT was chosen because it has an advantage over other aptitude tests, such as the UniTest trialled at Monash’s Berwick campus in 2007: it is undertaken by all students completing their VCE, so no extra burden would be placed on students and administrative overheads would be minimised.

Additionally, extensive analysis was conducted by Professor Rob Hyndman and Dr Muhammad Akram at Monash University to measure the predictive value of GAT results on first year marks. The study used the results of 4,270 students commencing in 2004. It found that, overall, a student’s ENTER is the best predictor of their first year marks, accounting for 35.4% of the variance when used on its own, although this varies by course. However, the addition of GAT scores improved predictive power by about 3% and the effect was significant (Hyndman & Akram, 2006, p. 11). When used on its own, the GAT was also found to be a strong predictor, accounting for 32.3% of the variance in average marks (Hyndman & Akram, 2006, p. 12). The predictive power of the GAT was also stronger for students with lower ENTERs, suggesting that the GAT may be most useful for students on the borderline of receiving a Monash offer.
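
The Hyndman & Akram report is unpublished, so as an illustrative sketch only, the incremental predictive value of the GAT could be measured by comparing the variance explained by nested linear models, as below. The data file and column names (avg_mark, enter, gat) are hypothetical assumptions.

```python
# Sketch: incremental predictive value of GAT scores over ENTER, measured by
# comparing the R-squared of nested OLS models. Hypothetical data and columns.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("first_year_results.csv")  # one row per commencing student

m_enter = smf.ols("avg_mark ~ enter", data=df).fit()       # ENTER alone
m_gat = smf.ols("avg_mark ~ gat", data=df).fit()           # GAT alone
m_both = smf.ols("avg_mark ~ enter + gat", data=df).fit()  # ENTER plus GAT

print(f"R^2, ENTER only:  {m_enter.rsquared:.3f}")  # 0.354 in the 2004 cohort
print(f"R^2, GAT only:    {m_gat.rsquared:.3f}")    # 0.323 in the 2004 cohort
print(f"R^2, ENTER + GAT: {m_both.rsquared:.3f}")

# An F-test on the nested models checks whether adding GAT is significant.
f_value, p_value, _ = m_both.compare_f_test(m_enter)
print(f"F = {f_value:.2f}, p = {p_value:.4f}")
```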

An extensive consultation process then commenced with key stakeholders, including the Victorian Minister of Education and the Victorian Curriculum and Assessment Authority. Additionally, a joint Monash/Victorian Tertiary Admissions Centre (VTAC) discussion paper was sent to all Victorian secondary schools. There was little objection to Monash’s proposal to pilot use of the GAT, for three years, as additional information for consideration in middle band selection (where approximately 20% of places are assessed on a range of extra criteria for applicants with ENTERs below the ‘clearly-in’ score).

The main reasons for piloting the GAT were:

1. To differentiate between middle band applicants to identify those most likely to succeed at university.

2. To assist in identifying and selecting applicants who were likely to succeed at university but had underperformed in Year 12 due to personal circumstances or educational disadvantage.

This paper provides a preliminary evaluation of the GAT pilot to monitor the extent to which it is meeting its aims.

Background

Increasing participation and diversifying the student population while maintaining quality

The 2008 Review of Australian Higher Education highlighted the important role the higher education sector has in preparing Australia for the demands of a global economy, but found that the sector is not keeping up with current and future labour market demand for university qualified employees (Bradley, Noonan, Nugent & Scales, 2008). The Review recommended that Australia increase participation in higher education and, because participation varies across society, that there be a focus on under-represented groups (Bradley et al., 2008, p. 45).

The Australian Government has responded by setting targets for low socio‐economic status (SES) student participation (20% of the undergraduate population by 2020) and bachelor level qualification attainment among 25‐34 year olds (40% of 25‐34 year olds by 2025). To provide assistance, the current cap on Commonwealth supported university places for domestic students will be removed by 2012 (Department of Education, Employment and Workplace Relations, 2009a).

The Government response encourages and provides the opportunity for universities to grow and diversify their domestic student populations. While this provides greater flexibility in setting strategic directions around education provision, it also raises some challenges. How do universities ensure that growth and diversification are achieved without compromising the quality of education? This challenge is likely to be a focus of the newly established Tertiary Education Quality and Standards Agency (TEQSA) and the upcoming cycle of audits. The Government has explicitly indicated that, while the growth of the sector is a major goal, ‘institutions will be required to demonstrate that their graduates have the capabilities that are required for successful engagement in today’s complex world’, and that student academic performance will be a focus of TEQSA (Department of Education, Employment and Workplace Relations, 2009b).

Admission policies will play an important role in assisting universities to reach the Government participation targets. To maintain quality, universities will need to ensure that any growth or diversification of the student population is obtained only by recruiting students who are academically able to complete tertiary studies successfully. If universities attempt to reach the targets by recruiting students who are unlikely to succeed at university even with additional support, they run the risk of setting students up for failure or devaluing their degrees. This might also create a university student population that is diverse in ability rather than social background, which can in turn create new teaching and learning challenges and impact negatively on the student experience.

The use of Year 12 results in selection and why this may play a role in the under-representation of low SES students

Currently, the primary tool used by Australian universities to select school leaver students is Year 12 results. Research has shown that this is the best tool currently available (Hyndman & Akram, 2006), but Year 12 results are not a perfect predictor of university performance and may be affected by personal circumstances and student background.

One of the main factors hindering university participation and qualification attainment among under-represented groups, such as those from low SES backgrounds, is that academic achievement at the secondary level is often lower among these groups. Year 12 completion rates improve with socio-economic status (James, Anderson, Bexley, Devlin, Garnett, Marginson & Maxwell, 2008), as do ENTERs (Birrell, Rapson, Dobson, Edwards & Smith, 2002, p. 14). It follows, then, that as Year 12 results are the primary selection tool used by universities, students from low SES backgrounds are less likely to obtain a place.

There are a range of theories that try to explain why students from low SES backgrounds under-perform in their secondary studies. James (2002, p. 141) argues that their aspirations to attend university are lower, due to less encouragement from parents and less confidence in their own ability (James, 2002, p. 50). Additionally, and of some concern for the education system, his research also found evidence that low SES students were much less likely than high SES students to feel that their teachers encouraged them to aim for university (44% compared to 58%) (James et al., 2008, p. 37).

Other researchers have specifically focused on the impact of schooling and shown that, in Victoria, the Year 12 results obtained by Government schooled students fall behind those of students from Independent and Catholic schools (Edwards, Birrell & Smith, 2005, pp. 10, 20). They argue that school resources play an important role in how well a student performs in Year 12, and that students who cannot afford to attend Independent schools, or do not have access to a high performing Government school, may not have the same opportunities to do well enough in Year 12 to obtain a place at university.

As stated in the final report of the Admissions to Higher Education Review undertaken in the United Kingdom (2004, p. 23), the university system cannot ‘be responsible for compensating for social disadvantage or shortcomings in other parts of the education system’. If Government schools are not sufficiently preparing students for university, then this should be dealt with at the secondary level rather than by universities. However, further research has suggested that the Year 12 results of students from Government schools may under-reflect their potential to succeed at university. In a study of students commencing their studies at Monash University in 2000-2003, Dobson & Skuja (2002, p. 59) found that students who had attended non-selective Government schools performed better than students from Catholic, Independent and selective Government schools when ENTER was controlled for. They argued that private schools are able to ‘add-value’ and that, as a result, ‘bright students from non-selective Government schools are disadvantaged in Year 12’ (Dobson & Skuja, 2002, pp. 59-61). Similar findings were made by Naylor & Smith (2002) in a UK study comparing A level results, school type and degree performance.

Part of the challenge for universities in selecting students who are able to perform at university is to find a way of compensating for disadvantage, to the extent that it has resulted in Year 12 results that are not an accurate indicator of an individual’s likelihood of success at university.

In Victoria, the most common approach used to compensate for disadvantage in selection is the Special Entry Access Scheme (SEAS), supported through VTAC. Applicants can apply for additional consideration under a number of equity categories, including one covering socio-economic disadvantage, and are provided with bonus aggregate points (up to 12 at Monash) if they can show, through an impact statement and other evidence, that they have faced economic hardship that has affected their ability to reach their educational potential. The assessment of these applicants is done in a systematic and consistent way, but it is, to some degree, a qualitative and subjective measurement.

Some universities also try to account for the impact of schooling on Year 12 results by providing consideration to applicants who attend a school that is under-represented at university (Victorian Tertiary Admissions Centre, 2009). However, this is a broad-brush approach which assumes all students at the selected schools have been disadvantaged in the same way and therefore need to be compensated in the same way.

The role of aptitude tests

In 2006, Monash University began considering aptitude tests as a tool to provide supplementary information about an applicant’s academic abilities and, as outlined in the introduction, decided to use the GAT. It was believed that the GAT might assist in identifying students who had the academic ability to succeed at university but whose Year 12 results under-reflected their potential. Secondly, it was thought that it could be used to differentiate academic ability between students on the borderline of receiving an offer. Given the research outlined above, a secondary outcome may be that participation of students from under-represented groups, such as those from low SES backgrounds, will improve.

It is argued that aptitude tests are also influenced by external factors linked to socio-economic status, such as schooling (Stringer, 2008, p. 55). However, the Admissions to Higher Education Review undertaken in the United Kingdom recommended that the use of aptitude tests for university admissions be examined as they may ‘help uncover hidden talent’ (2004, p. 49). This recommendation was partly based on a UK study examining the relationship between A level results and results on the Reasoning Test component of the Scholastic Assessment Test (SAT), a test commonly used in the United States for university admission (McDonald et al., 2001). This research found that, while both measures appear to have a relationship with socio-economic status (i.e. students from high SES backgrounds perform better on both measures) and there was a significant relationship between A levels and SAT results, the two measure different aspects of academic ability. Some students achieved high results on one measure but not on the other, and students from low attaining schools were much less likely to score well on both tests (21% compared to 44%) (McDonald et al., 2001, p. 30). The researchers also pointed out that, if university selection were based on a student obtaining a good result on either or both of their A levels and the SAT, this would almost double the number of students from low attaining schools eligible for an offer, while increasing the eligibility of students from high attaining schools by only 53% (McDonald et al., 2001, p. 31).

Description of use of the GAT in selection

While the GAT pilot commenced at Monash in 2008, there was some adjustment in the use of GAT scores between 2008 and 2009, so the analysis shown here focuses only on admissions for 2009. In 2009, the University used achieved ENTERs and GPENTERs (ENTERs predicted from GAT scores) to assist in the selection of applicants. The GPENTERs were provided by VTAC using a statistical model developed by Rob Hyndman based on data for Monash’s VTAC applicants in 2008.
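
The VTAC/Hyndman model itself is not described in this paper. As a minimal sketch only, an ENTER could be predicted from the three GAT component scores using the previous year’s applicants as training data; the file names, column names and model form below are all assumptions.

```python
# Sketch: predicting an ENTER (a GPENTER) from GAT component scores with a
# linear model fitted on the previous year's applicants. The actual VTAC
# model is not published here; names and model form are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

prior = pd.read_csv("applicants_2008.csv")  # hypothetical training data
model = smf.ols(
    "enter ~ gat_written + gat_maths_science + gat_humanities", data=prior
).fit()

current = pd.read_csv("applicants_2009.csv")
# Predicted ENTERs, capped at the ENTER ceiling of 99.95.
current["gpenter"] = model.predict(current).clip(upper=99.95)
```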

The GPENTER was used during middle band selection, which is where approximately 20% of places are set aside for assessment on a range of extra criteria, including bonuses for SEAS applicants or for those who have performed well in relevant VCE subjects. Applicants were assessed on middle band criteria if they achieved an ENTER (or equivalent) that was below the estimated clearly-in score for the course at the time of assessment, but above the eligibility score for the course².

² The clearly-in score for a course is the ENTER score point above which all applicants received an offer. Applicants given an offer after an assessment of middle band criteria will usually have achieved an ENTER below the clearly-in score, although the clearly-in score can shift once middle band selection has been made. The eligibility score for a course is the lowest ENTER accepted by the University for that course. Applicants with an achieved ENTER below this cannot obtain an offer even if middle band bonuses raise their ENTER above this point. The only exception to this rule is for SEAS applicants.
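
The rules in the footnote above can be summarised in code. The sketch below is a plain reading of those rules, not Monash’s selection system; the field names and the exact condition for a middle band offer are assumptions.

```python
# Sketch of the clearly-in / eligibility rules described in footnote 2.
# Field names and the middle-band offer condition are assumptions.
from dataclasses import dataclass

@dataclass
class Applicant:
    enter: float          # achieved ENTER
    bonused_enter: float  # ENTER recalculated after middle band bonuses
    is_seas: bool         # applying under the Special Entry Access Scheme

def outcome(a: Applicant, clearly_in: float, eligibility: float) -> str:
    if a.enter >= clearly_in:
        return "standard offer"        # above the clearly-in score
    if a.enter < eligibility and not a.is_seas:
        # Bonuses cannot lift a non-SEAS applicant over the eligibility score.
        return "ineligible"
    if a.bonused_enter >= clearly_in:  # assumed offer condition
        return "middle band offer"
    return "no offer"
```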


In 2009, GAT bonuses (between 2 and 9 points) were given to applicants whose GPENTER was higher than their achieved ENTER. Table 1 outlines the bonuses. These bonuses were added to the applicant’s achieved aggregate³ score so that their ENTER could be recalculated for reassessment.

Table 1: Aggregate bonus points assigned to middle band applicants at Monash University in 2009 whose GAT predicted ENTER was higher than their achieved ENTER

Difference between GAT predicted ENTER and achieved ENTER    Bonus aggregate points
0.1 – 5.9                                                    2
6.0 – 10.9                                                   4
11.0 – 15.9                                                  6
16.0 – 20.9                                                  8
21.0 and above                                               9
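
As a concrete reading of Table 1, the bonusing step could be implemented as the lookup below. The helper name is hypothetical, and the treatment of differences of 21 points or more as the top band is an assumption.

```python
# Sketch of the Table 1 bonus lookup. Differences are assumed to be reported
# to one decimal place, so the bands are contiguous.
def gat_bonus(gpenter: float, enter: float) -> int:
    diff = round(gpenter - enter, 1)
    if diff < 0.1:
        return 0  # no bonus when the GPENTER does not exceed the ENTER
    for upper, bonus in [(5.9, 2), (10.9, 4), (15.9, 6), (20.9, 8)]:
        if diff <= upper:
            return bonus
    return 9  # differences of 21.0 points or more

# The bonus is added to the achieved aggregate (see footnote 3), and the
# ENTER is then recalculated from the bonused aggregate for reassessment.
```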

Results

The analysis outlined here provides a preliminary evaluation of the pilot. It focuses on applicants who applied for a place at Monash in 2009 directly after completing their VCE, and for whom ENTER scores and GPENTERs were available.

The key goal of the pilot was to find a way of improving the selection of students who were likely to succeed at university, and so this issue is examined.

Given the targets recently set by the Government for participation among under‐represented groups, and the body of literature arguing that Year 12 results can sometimes under‐reflect the academic potential of students from low SES backgrounds, the analysis also examines the extent to which GAT scores are influenced by SES compared to the ENTER. It also investigates whether the pilot resulted in more students from low SES backgrounds being selected for an offer.

³ The aggregate score is the basis for calculating the ENTER for VCE students. It is the sum of the top four subject scores (including English) plus 10% of any fifth or sixth subject. The subject scores are first scaled to adjust for difficulty. The ENTER is then calculated from a percentile ranking of the aggregate scores. More information can be found in the ‘ABC of Scaling’ document published by VTAC: http://www.vtac.edu.au/pdf/publications/abcofscaling.pdf


Did the GAT pilot assist in selecting applicants who were likely to succeed at university?

In 2009, a total of 163 VCE school leavers received a final Monash offer as a direct result of the GAT pilot and, of these, 118 enrolled. To examine the extent to which these students are academically able, their average marks and progress rates (% of units passed) were analysed (see Table 2). The analysis is very preliminary as it is based only upon first semester results. Further analysis will be necessary before firm conclusions can be drawn, and it will be important to track these students throughout their degrees to ensure that the findings shown here are not overly influenced by transition issues.

Overall, students who obtained a GAT offer performed at a lower level than other students. Their average mark was 60.4 compared to 66.7 for the total group. Similarly, the GAT students passed 82.9% of their units on average compared to a 91.4% progress rate for the total group.

However, when the GAT students are compared only with other successful middle band applicants (who therefore also had an ENTER below the clearly-in score for their course), the differences are much smaller (average marks of 60.4 compared to 61.9, and progress rates of 82.9% compared to 87.2%). While the GAT students still achieved results that were lower than the other middle band students, the differences were not statistically significant.

Table 2: First semester average marks and progress rates among domestic VCE School Leaver students commencing a bachelor’s pass degree at Monash University by their offer type, 2009

Type of offer                                        Average Mark   N      Std Error   Sig*     Progress Rate   N      Std Error   Sig*
GAT based offer (A)                                  60.4           118    1.41                 82.9            121    2.89
Standard - applicant above clearly-in (B)            67.1           2733   0.24        A C      91.6            2754   0.41        A C
Middle band - applicant below clearly-in (C)         61.9           424    0.70                 87.2            428    1.31
Range of criteria or clearly-in not available (D)    69.8           548    0.44        A B C    95.8            551    0.67        A B C
Total                                                66.7           3823   0.21                 91.4            3854   0.36
*Lettering indicates where the measure is significantly greater at the 0.05 level
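
The paper does not state which test underlies the significance letters in Table 2. One plausible approach is pairwise comparison of group means, sketched below with Welch’s t-test; the test choice, data file and column names are all assumptions.

```python
# Sketch: pairwise comparisons of average marks across offer types, as one
# way of producing significance letters like those in Table 2.
from itertools import combinations
import pandas as pd
from scipy import stats

df = pd.read_csv("commencing_students.csv")  # hypothetical: offer_type, avg_mark

for g1, g2 in combinations(sorted(df["offer_type"].unique()), 2):
    a = df.loc[df["offer_type"] == g1, "avg_mark"].dropna()
    b = df.loc[df["offer_type"] == g2, "avg_mark"].dropna()
    t, p = stats.ttest_ind(a, b, equal_var=False)  # Welch's t-test (assumed)
    print(f"{g1} vs {g2}: t = {t:.2f}, p = {p:.3f}"
          f" ({'significant' if p < 0.05 else 'n.s.'} at 0.05)")
```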


To what extent are GAT scores influenced by socio-economic disadvantage?

In support of previous work on socio-economic status and Year 12 results, ENTER scores do improve on average as socio-economic status improves⁴ (Figure 1). This applies both to all VTAC applicants and to the sub-group applying to Monash University, although ENTER scores among Monash applicants are higher. The pattern also holds for GPENTERs, although it is less marked (Figure 2). Among Monash’s applicants, GPENTERs were slightly lower than ENTERs, but the difference was greatest for high SES applicants. As a result, there is slightly less variation between high and low SES applicants in the average GPENTER than in the average achieved ENTER (10.43 points compared to 12.76), suggesting that the GAT may adjust for some of the impact of disadvantage.

[Figure 1 is a bar chart. Mean ENTERs for Monash applicants: High 81.08 (n = 9,799), Medium 72.12 (n = 8,002), Low 68.32 (n = 2,666), Total* 75.92 (n = 20,830). Mean ENTERs for all VTAC applicants: High 74.37 (n = 15,423), Medium 67.45 (n = 17,473), Low 59.35 (n = 6,043), Total* 64.12 (n = 39,067). *Total includes applicants for whom there was no socio-economic status information.]

Figure 1: Mean ENTER for Monash’s VCE school leaver applicants and all VTAC VCE school leaver applicants by socio-economic status of the applicant’s postcode, 2009

⁴ Applicants were classified into high, medium and low SES according to their home postcode. This classification was provided by the Department of Education, Employment and Workplace Relations and is based on the Australian Bureau of Statistics 2006 SEIFA (Socio-Economic Indexes for Areas) index, with an adjustment for estimated resident population data for 15-64 year olds. The low SES postcodes are those in the bottom 25%, while high SES postcodes are those in the top 25%.
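
The postcode classification in footnote 4 amounts to a quartile cut on a SEIFA-based percentile. A minimal sketch follows, assuming a hypothetical DEEWR lookup file with postcode and percentile columns.

```python
# Sketch of the postcode-to-SES classification in footnote 4. The DEEWR
# lookup table is represented by a hypothetical CSV.
import pandas as pd

seifa = pd.read_csv("deewr_seifa_postcodes.csv")  # columns: postcode, percentile

def ses_band(percentile: float) -> str:
    """Bottom 25% of postcodes -> Low; top 25% -> High; otherwise Medium."""
    if percentile <= 25:
        return "Low"
    if percentile >= 75:
        return "High"
    return "Medium"

seifa["ses"] = seifa["percentile"].apply(ses_band)
applicants = pd.read_csv("applicants.csv")  # hypothetical; has a postcode column
applicants = applicants.merge(seifa[["postcode", "ses"]], on="postcode", how="left")
# Applicants whose postcode is not in the lookup are left unclassified and
# fall into the 'no SES information' component of the figure totals.
```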


[Figure 2 is a bar chart of mean achieved ENTER versus mean GAT predicted ENTER among Monash applicants: High 81.08 vs 77.68 (n = 9,799), Medium 72.12 vs 72.14 (n = 8,002), Low 68.32 vs 67.25 (n = 2,666), Total* 75.92 vs 74.17 (n = 20,830). *Total includes applicants for whom there was no socio-economic status information.]

Figure 2: Mean achieved ENTER and GAT predicted ENTER for Monash’s VCE school leaver applicants by socio-economic status of the applicant’s postcode, 2009

The differences in ENTERs and GPENTERs across the three SES groups may simply reflect true differences in academic ability and potential to succeed at university. To test this further, two regression models were built to measure the relationship between SES and university performance while controlling for the impact of ENTER or GPENTER. This analysis was undertaken not to find the best predictor of university performance, but to help identify the extent to which the ENTER or GPENTER under-reflects the academic potential of low SES students. University performance was measured using first semester average marks among VCE school leavers who commenced at Monash in 2009. The analysis was based on the performance of 3,823 students.

The following models were developed:

ENTER model:
Average Mark = 0.96 + 0.74 × ENTER + 2.28 × Low SES + 1.95 × Medium SES + e
(standard errors: 2.1, 0.02, 0.65, 0.41)

GPENTER model:
Average Mark = 41.1 + 0.31 × GPENTER + 0.82 × Low SES + 0.41 × Medium SES + e
(standard errors: 1.41, 0.02, 0.7, 0.44)
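
A sketch of how models of this form could be fitted follows, with high SES as the reference category so that the Low and Medium coefficients correspond to the dummy variables in the equations above. The data source and column names are assumptions.

```python
# Sketch: fitting the two models above as OLS regressions with SES dummies.
# Treatment coding with 'High' as the baseline reproduces the equations'
# Low SES / Medium SES terms. Hypothetical data file and columns.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("commencing_students.csv")  # avg_mark, enter, gpenter, ses

enter_model = smf.ols(
    "avg_mark ~ enter + C(ses, Treatment(reference='High'))", data=df
).fit()
gpenter_model = smf.ols(
    "avg_mark ~ gpenter + C(ses, Treatment(reference='High'))", data=df
).fit()

# .params holds the coefficients, .bse the standard errors shown in
# parentheses under each coefficient, and .rsquared the fit statistic.
print(enter_model.summary())
print(gpenter_model.summary())
```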


Both models were statistically significant, although the ENTER model was a better fit than the GPENTER model, with a larger R² value (0.21 compared to 0.08) and a slightly smaller standard error (11.32 marks compared to 12.18).

The influence of SES in the two models is quite interesting. In the ENTER model, SES was significant and the coefficients were quite large. The model results show that, when the effect of ENTER is controlled for, a low SES student will, on average, have an average mark 2.28 points higher than a high SES student (plus or minus 0.65 in about two thirds of cases). Medium SES students also performed better (1.95 marks higher than high SES students, plus or minus 0.41), suggesting that the extent to which Year 12 results, as a predictor of university performance, are disproportionately affected by external factors may be similar among students from medium and low SES backgrounds. Perhaps the Year 12 results of students who are financially well off are inflated, rather than the results of low SES students being deflated.

In the GPENTER model, the variation in average marks between low, medium and high SES students was not significant. This suggests that, while the GPENTER is not a particularly strong predictor of university performance on its own, it is less likely to under-reflect the performance of low and medium SES students at university. As a result, using the GAT in combination with the ENTER may help adjust for any undue influence of SES on ENTER results.

Did the GAT pilot assist in diversifying Monash’s cohort of commencing students with regard to socio-economic status?

An analysis of the differences between GPENTERs and ENTERs among individuals showed that, of the 20,830 VCE school leavers who applied to Monash in 2009, 41.7% (8,690) had a GPENTER that was greater than their achieved ENTER (see Figure 3 and Figure 4), and that this was more likely among low SES applicants than high SES applicants (46.2% compared to 35.0%). Additionally, the mean difference between the GPENTER and ENTER differed significantly between these two groups.

Interestingly, it was those from medium SES backgrounds who were most likely to have obtained a GAT result indicating that their ENTER might under-represent their academic ability (48.6%), and the distribution shown in Figure 3 has a very similar pattern for both medium and low SES applicants. The mean difference between the GPENTER and ENTER for medium SES applicants was significantly higher than for those from both the high and low SES groups. These findings further support the results of the regression analysis and indicate that the impact of personal or external factors on Year 12 results may be similar for the low and medium SES groups, as they have similar patterns of variation between ENTER and GPENTER.
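
The comparisons in the two paragraphs above could be reproduced along the following lines; the test choice, data file and column names are assumptions.

```python
# Sketch: share of applicants whose GPENTER exceeds their ENTER by SES group,
# and a comparison of mean GPENTER-minus-ENTER differences between groups.
import pandas as pd
from scipy import stats

df = pd.read_csv("applicants.csv")  # hypothetical: gpenter, enter, ses
df["diff"] = df["gpenter"] - df["enter"]

# Proportion with GPENTER above ENTER (cf. 46.2% low vs 35.0% high SES).
print(df.groupby("ses")["diff"].apply(lambda s: (s > 0).mean()))

low = df.loc[df["ses"] == "Low", "diff"]
high = df.loc[df["ses"] == "High", "diff"]
t, p = stats.ttest_ind(low, high, equal_var=False)  # assumed: Welch's t-test
print(f"Low vs High mean difference: t = {t:.2f}, p = {p:.4f}")
```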


[Figure 3 is a histogram of the difference between the GAT predicted ENTER and the achieved ENTER (x-axis, in bands from -40 to 40 and more; y-axis, % of applicants), with separate distributions for high, medium and low SES applicants.]

Figure 3: Distribution of the difference between GAT predicted ENTER and achieved ENTER for Monash’s VCE school leaver applicants by socio-economic status of the applicant’s postcode, 2009

To be eligible for selection, Monash applicants had to meet the eligibility score criterion for their preferenced courses with their achieved ENTER or a SEAS bonused ENTER. This is a ruling specified by DEEWR. GAT bonusing could not be used to bring an applicant’s ENTER above the eligibility score, as the use of the GAT was not part of a special entry scheme. When the analysis was restricted to eligible applicants, a much smaller proportion (28.7%) were able to obtain a GAT bonus.

Due to the eligibility score rule, eligible applicants living in low SES postcodes were only slightly more likely to qualify for a GAT bonus than those from high SES postcodes (28.2% compared with 26.2%). Many of Monash’s low SES applicants did not achieve Year 12 results that were high enough to meet the minimum course requirements.


[Figure 4 is a bar chart; the y-axis shows the percentage of applicants with a GAT predicted ENTER greater than their achieved ENTER. All applicants: High 35.0% (N = 9,799), Medium 48.6% (N = 8,002), Low 46.2% (N = 2,666), Total* 41.7% (N = 20,830). Eligible for selection: High 26.2% (N = 7,234), Medium 32.9% (N = 4,420), Low 28.2% (N = 1,236), Total* 28.7% (N = 13,140). *Total includes applicants for whom there was no socio-economic status information.]

Figure 4: Proportion of Monash’s VCE school leaver applicants (all applicants and those eligible for selection) who had a GAT predicted ENTER that was greater than their achieved ENTER, by socio-economic status of the applicant’s postcode, 2009

To examine this further, offer outcomes were examined (Figure 5). The analysis was restricted to those in the middle band because the GAT was only used for these applicants. It includes all applicants who were above the eligibility score, except those who received a standard offer with an ENTER above the clearly-in score for the course, or who received an offer for a course that uses a range of criteria or for which no clearly-in score was available.

In total, of the 5,002 final offers made to VCE school leavers in 2009, 705 were middle band offers. Despite the findings shown in Figure 4, a slightly larger proportion of both low and medium SES middle band applicants obtained a GAT offer than high SES applicants (2.0% and 2.3% compared to 1.6%). However, the only statistically significant difference was between the GAT offer rates of the medium and high SES applicants.

It seems that, while the GAT pilot in 2009 helped to increase offers among low and medium SES middle band applicants more so than among high SES applicants, the impact was minimal. It can be concluded that, in 2009, the GAT pilot did not assist Monash in selecting a more diverse cohort of middle band applicants with regard to socio-economic status. However, this is predominantly due to the eligibility score ruling and the fact that Monash’s eligibility scores are all at an ENTER of 70 or above. Many of Monash’s low SES applicants fell below these scores, which ruled them out of selection. At other institutions, with lower eligibility scores, the results may differ.

[Figure 5 is a stacked bar chart of application outcomes for middle band applicants by socio-economic status of the applicant’s postcode. The underlying percentages are:

Outcome                                             High         Medium       Low         Total*
                                                    (N = 4661)   (N = 3107)   (N = 842)   (N = 8811)
No offer - not GAT bonus eligible                   67.9%        61.5%        66.3%       65.5%
No offer - GAT bonus eligible                       24.4%        30.3%        24.9%       26.5%
Other middle band offer - not GAT bonus eligible    3.1%         2.5%         2.7%        2.8%
Other middle band offer - GAT bonus eligible        1.8%         1.6%         1.5%        1.7%
SEAS offers                                         1.1%         1.7%         2.5%        1.6%
GAT offer (includes GAT/SEAS offers)                1.6%         2.3%         2.0%        1.8%

*includes applicants for whom there was no socio-economic status information]

Figure 5: Application outcome of Monash’s VCE school leaver applicants who met the eligibility score criterion and were in the middle band, by socio-economic status of the applicant’s postcode, 2009

Conclusion

The findings shown here reflect a very preliminary evaluation of Monash’s GAT pilot. Using only one intake’s worth of data and one semester’s worth of university results, the analysis cannot confirm the success or otherwise of the program. Further analysis and monitoring will be required as the pilot progresses.

However, the results are promising. The GAT has provided Monash with a useful additional tool for student selection, and 163 students who would otherwise not have been offered a place were selected. Without this pilot, these students may not have received a university offer at all, or may have received an offer for a course or institution that was lower on their preference list. Additionally, semester 1 results show that these students performed as well as other students selected on middle band criteria.


Given the Government focus on improving participation among low SES students, and research suggesting that Year 12 results among low SES students are more likely to under-reflect true academic potential, analysis was undertaken to examine whether the use of the GAT in selection might assist universities to meet the government targets while maintaining quality. The ENTER is the strongest predictor of university performance currently available to universities. However, it does appear to over-reflect academic potential among high SES students and/or under-reflect it among low and medium SES students. The GPENTER, on the other hand, is not as strong a predictor of university performance, but it does not appear to be influenced by SES in the same way. As a result, a selection process involving both the GAT and the ENTER may help mediate any disproportionate effect of SES on Year 12 results.

When the differences between GPENTER and ENTER were examined among Monash’s applicants, it was found that low and medium SES students were much more likely to have obtained a GPENTER that was greater than their ENTER. Therefore, any use of the GAT in selection should pick up a larger proportion of low and medium SES students than using the ENTER alone. In the case of Monash, however, while the GAT pilot resulted in the selection of a larger proportion of medium SES applicants from the middle band than high SES applicants, it did not assist in selecting a larger proportion of low SES middle band applicants. This was due to the eligibility score ruling: many of Monash’s low SES applicants, and particularly those with a GPENTER greater than their ENTER, were not eligible for selection as they did not meet the minimum entry requirements.

As the pilot progresses, Monash will continue to monitor and evaluate it. It will be interesting to see how students selected on the basis of their GAT results progress through their courses. Further research could also focus on more specific drivers of underperformance in Year 12 beyond SES, and the potential of the GAT to mediate these impacts. These could include school type and resources, or demographics such as parental education or income, which are likely to be better indicators of SES than applicant postcode. Unfortunately, the analysis shown here could not examine those issues due to a lack of available data.


References

Birrell, B., Rapson, V., Dobson, I., Edwards, D., & Smith, T. F. (2002) From place to place: School, location and access to university. Clayton: Centre for Population and Urban Research, Monash University.

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008) Review of Australian Higher Education. Canberra: Department of Education, Employment and Workplace Relations. Retrieved 6 October, 2009, from http://www.deewr.gov.au/HigherEducation/Review/Documents/PDF/Higher%20Education%20Review_one%20document_02.pdf

Department of Education, Employment and Workplace Relations (2009a) Future Directions for Tertiary Education. Retrieved 6 October, 2009 from http://www.deewr.gov.au/HigherEducation/Review/Pages/FuturedirectionsforTertiaryEducation.aspx

Department of Education, Employment and Workplace Relations (2009b) A National Quality and Standards Agency – Fact Sheet Retrieved 6 October, 2009 from http://www.deewr.gov.au/HigherEducation/Documents/RTF/09_FactSheet_A%20national%20quality%2 0and%20standards%20agency.rtf

Dobson, I., & Skuja, E. (2002) Secondary schooling: Tertiary entry ranks and university performance. People and Place, vol. 13, no. 1, pp. 53-62.

Edwards, D., Birrell, B., & Smith, T. F (2005) Unequal Access to University Places: Revisiting Entry to Tertiary Education in Victoria. Clayton: Centre for Population and Urban Research, Monash University.

Hyndman, R. J., & Akram, M. (2006) The predictive value of GAT scores on first year performance. Department of Econometrics and Business Statistics Consulting Service, Monash University. (Unpublished report)

James, R., Anderson, M., Bexley, E., Devlin, M., Garnett, R., Marginson, S., & Maxwell, L. (2008) Participation and equity: A review of the participation in higher education of people from low socio-economic backgrounds and Indigenous people. Canberra.

McDonald, A. S., Newton, P. E., & Whetton, C. (2001) A pilot of aptitude testing for university entrance. London: The Sutton Trust.

Naylor, R., & Smith, J. (2002) Schooling effects on subsequent university performance: Evidence for the UK university population. Warwick Economic Research Papers, No. 657. Retrieved 7 October, 2009, from http://www2.warwick.ac.uk/fac/soc/economics/research/workingpapers/publications/twerp657.pdf


Schwartz, S. (2004) Fair admissions to higher education: Recommendations for good practice. Admissions to Higher Education Review. Retrieved 7 October, 2009, from http://www.admissions-review.org.uk/downloads/finalreport.pdf

Stringer, N. (2008) Aptitude tests versus school exams as selection tools for higher education and the case for assessing educational achievement in context. Research Papers in Education, vol. 23, no. 1, pp. 53-68.

Victorian Tertiary Admissions Centre (2009) Special Entry Access Scheme. Retrieved 8 October, 2009, from http://www.vtac.edu.au/pdf/publications/seas.pdf
