California State University Dominguez Hills Academic Senate Minutes April 22, 2009
Voting Members Present: Berlin, Bordinaro, Carvalho, Chavez, Corbin, D’Amore, Erbe, Fawver, Ferris, Fisher, Furtado, Furusa, García, Gould, Heinze-Balcazar, Hernandez, Jacobs, Kalayjian, Kaplan, Kowalski, Kravchak, Kulikov, Malamud, Murrey, Needham, Niederman, Salhi, Sneed, Wang, Whetmore, and Zugman.
Voting Members Absent: Bohman, Brooks, D., Cauthen, Ganezer, Hancock, Hinchberger, Hwang, Jones, Keville, Ma, Pawar, and Pinto.
Ex-Officio Members Present: Arasimowicz, Bersi, Blue, Bradfield, García, Zitelli for M. Gordon, Hohm, Karber, Maki, Robles, Rodriguez, Saks, Strong, and Vogel.
Guests: Oesterheld
Recorder: Brooks, M.
Call to Order 2:30 p.m.
Approval of Agenda Approved MSP
Approval of Minutes from 4/08/09: Amended MSP
Follow-Up Item-FPC 09-03-Resolution on Shared Governance-Sneed asked the senate to review the change that the Provost recommended: "That any campus structural reorganization broadly affecting curriculum, budget within academic affairs, university mission, or faculty assignments be referred to the Senate and other appropriate faculty bodies to ensure full campus consultation and joint decision-making deliberations."
Malamud was concerned that it did not cover all areas, such as student affairs or administration and finance. Vogel said that faculty governance is under academic affairs. There was a vote and the resolution passed. MSP.
Second Reading Item-EPC 09-04-Resolution on Computer Literacy-Jacobs reported that she collapsed items 1 and 2 and cleared up the language. There was a discussion. Whetmore said that our largest impact is on older returning students. Bordinaro said that older returning students have trouble with technology but can analyze and synthesize information, while younger students are great with technology but have difficulty analyzing and synthesizing information. There was a vote and the resolution passed. MSP.
Time Certain-4:00 p.m. President's Report-Mildred García-García spent the weekend at Air University in Montgomery, Alabama. She said that she met several presidents who are involved with Air University, including Susan Aldridge, University of
Maryland. She told Dr. García that Dominguez Hills has such a wonderful reputation. She knew the diversity of our student body, and she knew who we serve, and said that she wanted her campus to be just like ours. García said that it is great to hear people on the east coast talking about the wonderful work we do here at CSUDH.
That weekend García met with Senior Master Sergeant Jose A. Barraza. He told her that he grew up in San Pedro, Ca. He is a faculty member at Air University. He formerly came from the gang life but then turned his life around when he joined the Air Force. He said that his sister had talked to him earlier that day, and informed him that she had been accepted to CSUDH. García said she was so happy to hear this.
García reported that Dr. Eunice Krinsky had received a $20,000 to $54,000 grant to improve K-12 math/science. She heard this weekend that the Air Force is looking for students who know about cyberspace, aerospace, and math and science.
García reported that she is proud of the CSUDH theater department. She said that the cast of “The Heiress” has been selected by the black theater festival to perform there in July. García said that the university is working on raising money to pay for students to get there. This is a wonderful honor.
García reported that 61 of our exceptional student athletes were recognized by the NCAA for maintaining a 3.0 GPA and above, for being athletic and for being involved in the community. She said that John Johnson will be appearing on ABC.
García reported that she met with the synthesizers for the strategic plan. They will be sending out the information they have received. Please be sure to read it and review. Make sure that what you expressed is there.
García reported there was a reception for the Emeritus faculty to meet the new provost, Ron Vogel. This went very well.
She said that the Presidential Scholarship Dinner will be held tomorrow at 6:30 p.m. but that it has been toned down from previous events. So far, we have raised $140,000 in scholarships. García said that we will be honoring former Los Angeles County supervisor Ivonne Braithwaite Burke, a friend of CSUDH. She has arranged for a $50,000 endowment to be given to a CSUDH student.
García reported that our enrollment is ahead of last year. She said that last year at this time we had admitted 4,841 students and this year we have already admitted 8,814. She said that our special admits are down.
García reported on financial stability. She said the UBC will be meeting on April 24. She said that we will move forward and will be making difficult decisions. She said that she would much rather be giving out money than figuring out how to cut. She said that we will continue to talk, and discuss, but we will have to make decisions soon. García said we will go through the appropriate processes to do this. García said that Greg Saks
and his staff have put together a "Budget Central" on our university website. She encouraged the faculty to take a look and review.
García asked if there were any questions or comments, there were none.
Executive Reports Chair's Report-Munashe Furusa-Furusa talked about the passing of Roger Berry. He said that we will remember Roger for his innovative teaching, mentorship, and outstanding leadership. He said that Roger was an emeritus associate professor of marketing and in the FERP program. Roger Berry's memorial service will be held May 12, from 3 to 5 p.m., and the funeral service is scheduled for May 15. All students, faculty, and administrators are invited by the family to attend.
Furusa thanked Cheryl McKnight for the Earth Day speaker series and the Earth Day celebration.
Furusa congratulated Donis Leonard, the cast and crew of “The Heiress” for being invited to perform at the National Black Theatre Festival from August 3-8, 2009.
Furusa reported that he attended the Council of Senate Chairs meeting at the Chancellor's Office on April 9, 2009. John Tarjan spoke and encouraged campuses not to go over their target because the budgets would then be rebenched. There was discussion about early remediation and how it differs from campus to campus. The goal is to work towards proficiency.
Jeri Echeverria, Associate Vice Chancellor, said that the role of the academic senate chairs is to recommend policy and the role of the provost is to enact senate policy. The key to the success of shared governance is honest dialogue and good communication. The chairs need to meet with the president and provost on a regular basis. We need to listen to the statewide senate and decide with our campus if we want to be involved with issues. The short- and long-term issues were: budget reductions and maintaining quality; whether reductions are being made responsibly; and what is important/what do we want to achieve. There was a discussion about online teaching evaluations and what the different campuses are doing.
Parliamentarian Report-Marisela Chavez-Chavez reported that the president appointed Irene Vasquez to be the UBC chair. Chavez reported that John Thomlinson will be the faculty representative on UBC. Chavez encouraged all faculty to remember to vote for the statewide senate senator and the senate chair. She said that an email was sent out to all faculty with the link and their own identification number. She said that if the email is deleted the election committee cannot give out a new one, so please don't delete the email.
Statewide Senate Report-Caroline Bordinaro-Bordinaro reported that the Academic Affairs Committee had several pending resolutions for consideration. One supports the Doctorate of Nursing Program. There is a vote of no confidence resolution being looked at for administrators other than presidents. On a request for honorary degrees for Japanese internment victims, the statewide senate objects to the "mandate" aspect and is crafting a bill to support the spirit. We just passed a resolution that a score of 4 is appropriate on the IB exam; however, the new findings are that a score of 5 for the sciences may be more appropriate. The UC system and the CSU system will align scores. The General Education committee has decided that military basic training can be used for area E.
The CLA report is attached at the end of the minutes.
Vice President of Enrollment and Planning Report-Sue Borrego-None
Provost’s Report-Ron Vogel-Vogel passed out data from Institutional Research: DRW rates, an FTEF report of graduate programs, and a demographic report of students.
Vogel said that it takes $350,000 to run the graduate programs. The FTEF report shows the actual dollars that could be saved. Vogel said that we won't save a lot by cutting graduate programs, so we really need to think this through.
Vogel said that this information will go up on the website. Vogel was also going to put faculty salaries on the website, but there was concern about putting faculty names with their salaries. There was a vote and it was decided to post the total salaries without the names.
Bordinaro raised concern about using criteria other than graduation rates. She suggested criteria on how our students impact the community. Carvalho said that we can see what a program costs, but what about the grants it brings in? Vogel said that funds like grants are one-time dollars and are not considered because they are not permanent. Fawver said that she would like to see caps for each of these programs. Fawver said that the UBC also needs all the salaries with names. Furusa said that he can provide the list. Malamud said that salaries are public and he did not understand what the big deal was; if you want to know someone's salary, just go to the Sacramento Bee.
Vogel said that going after graduate programs is hard and we are being asked to cut. Academic Affairs gives 3 million in release time. We could propose department chairs go down to 3 units of release time. We have to do the hard planning now. Vogel said that a large number of faculty will be FERPing and also leaving FERP. We can save money by not replacing FERP faculty.
Vogel said that he wanted to be as open as possible and asked faculty to send him an email if they have questions or concerns.
Whetmore said that his program graduates between 60 and 70 students per year. He said that his program is carrying the load that allows smaller programs to fly, but he is concerned that it cannot be stretched any further. He said that they are at the breaking point.
Vogel introduced Mary Ann Rodriguez, and she passed out a budget document that addressed the perceived $6 million Bradfield had mentioned at the last meeting. The table explained in detail that the $6 million is not really extra but is already allocated. Rodriguez went through the report and explained the $6 million. She said that the report she passed out assumes the initiatives are going to pass, but they may not. Bradfield said that he was not trying to have a "gotcha" moment; he just needed clarification. Gould asked where the money from the Home Depot Center would show up on the report, and Rodriguez said that money is not included; we are trying to be less dependent on HDC, from which we get about $500,000 each year. Malamud wanted to know why the report did not show all the incoming funds and all the outgoing costs. Vogel suggested that we let the UBC do the necessary research; he was not sure that the senate was the appropriate place to go over this.
Malamud wanted to know why departments keep the salaries of retiring faculty. He said that if a department has a couple of people retiring they have a lot of money. Malamud wanted to know why excess money would stay in a college when there is a budget crisis.
Maki said that in the last 3 years we have had 4 provosts. It used to be that when faculty retired the excess money went to central academic affairs. Sam Wiley allowed the funds to stay in the college and Vogel has only been here 12 weeks and has not had a chance to make a decision.
Berlin said that it seems that we do a lot of things because it is common practice and not policy.
Vogel will be putting up a website soon with the budget information. He asked the senate if they wanted him to post names by salaries or leave the names off. The senate voted and it was decided to just total all the salaries by department and post the total, without the names.
Open Forum: The president of the women's club on campus spoke to the senate about the need for the Women's Resource Center and noted that 70% of the campus's students are female. Furusa reported to her that the Executive Committee has appointed an ad hoc committee to address the issues with the Women's Resource Center.
Fawver reported that the history-link-plus in the library will be suspended as of May 1, 2009. She was concerned because her students use it so much. Ma reported that there are two alternatives “World Catalogue” and “Mango”. Ma said that link-plus was being suspended because of cost.
Adjournment: Meeting adjourned at 5:07 p.m.
Report of the CSU Joint Taskforce on the Collegiate Learning Assessment April 2009
Background
The California State University is a participant in the Voluntary System of Accountability (VSA), an institutional partnership launched in 2006 by the American Association of State Colleges and Universities and the National Association of State Universities and Land-Grant Colleges. The VSA was established in response to recommendations made by the federal Commission on the Future of Higher Education established by then-Secretary of Education Margaret Spellings. The Commission report, "A Test of Leadership: Charting the Future of U.S. Higher Education," recommended, among other things, that institutions of higher education provide more information to the public about the success of their students, including measures of student achievement on a "value added" basis. The VSA, through its College Portrait, communicates a range of information on the undergraduate student experience through a common web reporting template that institutions place on their home pages and update regularly.
Participation in the VSA requires institutions to use one of three nationally used student surveys to demonstrate student gains during their college years: the Collegiate Learning Assessment (CLA), the Measure of Academic Proficiency and Progress (MAPP), or the Collegiate Assessment of Academic Proficiency (CAAP). For the purposes of the VSA, all three use ACT or SAT scores to set benchmarks for students’ expected performance. Tests must be administered every three years, and scores from the initial administration publicly reported within four years of an institution becoming a VSA participant. The VSA customized reports include average scores for both freshmen and seniors on various test modules, student learning gains on each test module, and the average ACT or SAT scores for students tested.
In August 2007 the CSU Executive Council of Presidents recommended that all CSU campuses participate in the VSA, using the Collegiate Learning Assessment (CLA) as the primary means of measuring cumulative student learning. The Executive Council also recommended a two-year pilot testing period, beginning in 2007-2008, after which a
determination would be made about the continued use of the CLA and the frequency of its administration.
The Academic Senate CSU (ASCSU) passed a resolution in January 2008 -- AS-2830-08/AA, "The Collegiate Learning Assessment (CLA)" -- that called for an assessment of the results of the two-year CLA pilot test by faculty. In late Fall 2008, the Presidents' Council on Accountability established the CSU Joint Taskforce on the Collegiate Learning Assessment to review the CSU experience with the CLA instrument. Taskforce members include Statewide Academic Senators Glen Brodowsky, San Marcos; Catherine Nelson, Sonoma (Co-Chair); Caroline Bordinaro, Dominguez Hills; and Mark Van Selst, San Jose. Other members include Dianne Harrison, President, Monterey Bay; Sandra Flake, Provost, Chico; Cecile Lindsay, Vice Provost, Long Beach (Co-Chair); Bettina Huber, Director of Institutional Research, Northridge; Linda Buckley, Associate Vice President, San Francisco; and Lorie Roth and Marsha Hirano-Nakanishi, Chancellor's Office staff.
The Taskforce was charged with addressing such questions as:
What does the CLA measure, and what do the results mean?
What is the best methodology for administering the CLA?
What connections are possible between the CLA assessment, authentic assessment of student work within academic programs, and efforts to strengthen degree programs?
How might the CLA serve to assess outcomes in general education programs?
Should the CLA be the sole instrument used by the CSU in assessing student skills?
Are any other available instruments a better match to CSU students and needs?
The Taskforce held four meetings from January through April 2009. This report contains a brief description of each instrument, including their strengths and weaknesses, a discussion of significant themes arising in the Taskforce’s deliberations, and finally, conclusions and recommendations to the Presidents’ Council on Accountability.
The Collegiate Assessment of Academic Proficiency (CAAP) The CAAP was originally designed by the ACT (originally an abbreviation for American College Testing, but changed to simply ACT in 1996) as a General Education outcomes assessment tool. According to the ACT, the CAAP will measure value-added performance gain longitudinally from CAAP to CAAP, or cross-sectionally in a comparison of an initial reading to later scores. It is also portrayed as a useful tool to assist in curriculum development, measure the strengths and weaknesses of GE programs, measure student performance over time and assist in accreditation and accountability reporting requirements. The instrument includes six modules: reading, writing skills, writing essay, mathematics, science and critical thinking. The test can be customized. The VSA version uses the writing essay and critical thinking modules. The writing essay consists of two 20-minute responses to prompts, each requiring a thesis statement and supporting explanation. Two trained evaluators score the essays. The Critical Thinking
Module is a 40-minute, 32-item multiple choice test in which students are asked to evaluate an argument or statement. The test is updated at least every three years.
The ACT requires that a random sample of at least 200 freshmen and 200 seniors be tested on each CAAP module. Freshmen must be tested early in the fall semester, and seniors within six months of graduation. Both groups are tested in the same academic year, and there is a two-week test window. Institutional norms are compared with other institutions in a given cohort. The standard reporting package includes institution and individual student level data.
Reports from students and administrators indicate that the CAAP has overly long instructions, and that there is not enough time to complete the test. Students often call the written essay topics boring, uncreative, and juvenile. An online version is not available, and answer sheets made of low-quality paper can mean smudged and wrinkled test sheets. Test administrators may be required to "clean up" the answer sheets before submission. Scoring the exam takes longer with an essay component (about a month). The written test can be scored locally, but finding faculty volunteers to do so is an impediment. If students take the CAAP more than once, they may receive the same test questions. The ACT does not have enough equated forms of each exam to undertake a longitudinal study with more than two testing points.
The Measure of Academic Proficiency and Progress (MAPP) The MAPP is a value-added test created by the Educational Testing Service (ETS), and is billed as an integrated test of GE skills. It has been around for about 20 years. It claims to provide a unified picture of GE programs, promote curriculum improvement through actionable score reports, provide a basis for comparison with other institutions nationwide, and provide flexibility through options for tailoring the instrument to the institution. The test is designed to measure reading, writing, critical thinking, and quantitative skills. There is a standard version with 108 multiple choice questions, and an abbreviated version with 36 questions. Standard form questions include 54 critical thinking and reading questions in the humanities, social sciences, and natural sciences, and 54 in writing and math. The long form takes 120 minutes, the short form 40 minutes. There is no writing component. Paper and web-based versions are available. Institutions have the option of adding a 30-minute essay administered by computer (with a 15-minute introductory tutorial). The essay gauges students' ability to formulate a position and make an argument. The essay is scored electronically by E-rater software that is calibrated by experts. The VSA uses the short form of the exam. With an essay added, the total time for the exam is 90 minutes.
MAPP provides two types of scoring, norm referenced scores (scaled scores), or criterion scores (proficiency classifications). Norm referenced scores compare one student or group of students to another, or the same student or groups at different points in time (MAPP website). According to ETS, these scores can be used to determine value added gains using cross-sectional or longitudinal studies. Criterion scores measure proficiency levels in a given skills set. Again, according to ETS, criterion scores can be used to measure curriculum improvement or individual student gains. The ETS claims the
MAPP has construct validity in measuring underlying skills because it is equated with the former Academic Profile.
Scores available for the standard form of the MAPP include total scores, subscores, and proficiency classifications for individual students, and total scores, norm-referenced subscores, and criterion-referenced scores at the group level. Scores available for the abbreviated form of the MAPP include total scores, but no subscores, for individual students (not enough data for statistically reliable scores). The institution gets total scores, norm-referenced subscores, and criterion-referenced subscores. Fifty valid results are necessary to provide aggregate reports. Institutions can also receive aggregated scores across a three-year period. Subscores are also available for the humanities, social sciences, and natural sciences.
The MAPP has both strengths and weaknesses. On the one hand, ETS has been around a long time, and has a great deal of experience in the testing field. With a multiple choice exam, you know what you are getting. On the other hand, there is no writing component, and the multiple choice writing portion focuses on structural or grammatical skills rather than the more holistic assessment of writing an argument. This means that the test does not measure the types of skills we expect students to have, and is reductionist because it does not provide students the opportunity to demonstrate their critical thinking skills in a creative way. Adding an essay may create an idiosyncrasy in the test score, which may interfere with the VSA's emphasis upon regularized scores. In addition, evaluation by E-rater may be problematic.
The Collegiate Learning Assessment (CLA) The CLA was developed by the Council for Aid to Education, an offshoot of the Rand Corporation. The CAE claims that the CLA measures student performance on tasks they will realistically be expected to use in the “real world.” The instrument is promoted as a tool to assist institutions in programmatic change to improve teaching and learning in the areas of critical thinking, analytic reasoning, problem solving and written communication. The level of analysis is the institution.
The CLA consists of two parts, the 75-minute analytic writing segment and the 90-minute performance task. The writing segment is composed of two sections, a "make an argument" piece that asks students to take and develop their position on a prompt, and a "critique an argument" piece that asks students to critique an argument made by someone else. The performance task asks students to prepare a memo based upon a set of supporting documents. The idea is to figure out which documents are useful and which aren't, and to make an argument based upon the information in the appropriate documents. The essays are now machine-graded, but the performance task is graded by evaluators. SAT or ACT scores are used to establish "expected performance levels," which are then used to assess whether students are doing better or worse than expected. An alternative 15-minute diagnostic exercise is available for students without a recorded SAT/ACT score.
The entire testing process, from the test to scoring to score reporting, is online. Cross-sectional administration requires the assessment of 100 freshmen and 100 seniors. The VSA uses the cross-sectional version of the test, in which students complete either the performance task or the two-part analytical writing task. The CAE has not yet established construct validity. Its website indicates that a construct validity study of all three VSA testing instruments is currently in progress.
Five institutional averages are reported for any given administration: two essay scores, an analytic writing task score, a performance task score, and an overall CLA score. This last is an average of the two task scores for the test group as a whole. Both difference and standardized deviation scores are reported as well. Difference scores compare the SAT-based expected scores and actual task scores or subscores, while deviation scores are standardized versions of the difference scores based upon a compilation of scores on the analytic writing test and the performance task. Once both freshman and senior cross-sectional administrations are complete, the deviation scores for each are subtracted and standardized. These last represent the "value-added" scores for the two tasks and the CLA as a whole.
The CSU now has almost two years experience with the CLA. Information available from 2007-2008 indicates that recruitment protocols are not standardized across the system. Eleven campuses embedded the test in courses. Some courses required the exam, some provided extra credit, some provided students completing the exam with vouchers for on-campus establishments and some made the CLA optional. Other campuses relied solely upon incentives to recruit students to take the exam (gift certificates, vouchers, coupons, academic competition). Some campuses used only persuasion.
Students report that the test is interesting to take, although seniors seem to take it more seriously than freshmen. Students report that they “never had a test like that before,” with such sentiments more frequent for the performance task portion of the exam. They appreciate being asked to use data, contextualize it, and actually say something.
The CLA comes the closest to measuring competencies that students will be asked to use in the real world. The CLA may be a good tool for increasing the use of performance tasks in the classroom. The CLA does not have a specific ethical reasoning, quantitative skills, or information literacy segment, all of which are skills students should be expected to develop in a university setting.
The fact that the CLA is all electronic can complicate administration of the test. The virtue of pencil and paper exams is that students can walk into their regular classroom and take the test. Dependency upon a computer lab may mean limiting the number of students that can take the test or are willing to come to a special room for that purpose. Sampling is also a problem, in that it is difficult to get a truly random sample with a voluntary test or with recruitment protocols currently in place. Initially, there were some technical difficulties in setting labs up for CLA administration, but these were worked out several years ago. Since then, administration has proceeded smoothly.
General Themes Several significant themes arose during the Taskforce's lengthy deliberations about the relative merits of the three tests. The issue that stood out the most was the appropriateness of using current value-added testing methods as either accountability or assessment tools. There was general agreement that they were not suitable for program assessment as presently constituted, administered, and reported. Other issues arose as well:
Given the number of variables that could contribute to students’ improvement on an exam over time, it is difficult to establish that the institution is solely responsible for any “value added” in students’ scores.
Devising recruitment protocols that will guarantee a random sample of subjects for a test can be problematic, especially across a 23 campus system.
The relative merits of longitudinal vs. cross sectional analysis were discussed, with the first seen as the more reliable and the second as the only practical option.
Whether or not to recommend that the test remain voluntary for students and voluntary for faculty in terms of embedding the test in their courses.
The validity of recent criticisms leveled at the CLA (see Perelman). These are two-fold: (1) the test has been described as "data lite," meaning that the testing context doesn't reflect the lengthy data-gathering students undertake in preparing class assignments or will have to undertake in future employment settings; and (2) the ease with which students can "scam" the essay portions by writing things that may be factually inaccurate but elegantly expressed.
The possibility of the VSA testing process becoming a backdoor for mandated learning objectives is troubling.
Conclusions and Recommendations The Taskforce ultimately decided to recommend continued use of the CLA to the Presidents' Council on Accountability. While none of the currently available options are adequate to the task at hand, the CLA does have potential. It seems to engage students more than the CAAP or the MAPP, and student motivation in testing is crucial. The CLA also attempts, albeit imperfectly, to measure "real world" tasks that students will be called upon to complete, and some of the initial test-takers have rated the performance task, in particular, as a valuable challenge. Standardizing recruitment protocols across the system would reduce the current variation in administrative procedures.
While the CLA tells us almost nothing about assessment, given the current political environment, it does provide a measure of accountability. And, CAE repeatedly emphasizes that it is only one measure among many that should be used to establish
institutional performance. Finally, CAE representatives are willing to meet with the CSU to explore tailoring the test to the system's needs.
The Taskforce therefore makes the following formal recommendations:
#1: Continue using the CLA under the auspices of the Voluntary System of Accountability.
#2: Convene a taskforce of system test administrators and faculty to develop standardized implementation protocols for use throughout the system.
#3: Delay renewed administration of the CLA for one year (i.e., during 2009-10). After that, require campuses to administer the test a minimum of once every three years.
#4: Characterize the CLA as an accountability instrument rather than as an assessment instrument. (Assessment is evaluation intended for program improvement while accountability is evaluation intended to demonstrate accomplishments to external groups.)
#5: Arrange a meeting between representatives from the CSU and CAE to discuss possible changes/additions to the CLA to tailor it to the CSU’s needs (e.g., additional attention to ethical decision-making and an additional module focusing on quantitative skills).
Sources
Trudy Banta, “Can Assessment for Accountability Complement Assessment for Improvement?,” AAC&U peerReview (Spring 2007), 9-12.
Caroline Bordinaro, Dianne Harrison, and Sandra Flake, "CAAP: Information and Analysis," PowerPoint Presentation to CLA Taskforce, February 18, 2009.
Stephen Klein, Richard Shavelson and Roger Benjamin, “The Collegiate Learning Assessment: Facts and Fantasies,” Draft Paper, January 26, 2007, not to be cited without permission, 1-21.
Ou Lydia Liu, "Measuring Learning Outcomes in Higher Education Using the Measure of Academic Proficiency and Progress (MAPP) Test" (September 2008), ETS RR-08-47.
Les Perelman, "Problems in Evaluating Four-Year Colleges," MIT Faculty Newsletter, Vol. XXI, No. 1, September-October 2008, url: http://mit.edu/fnl/volume/211/perelman.html, accessed March 27, 2009.
Susan Platt, “Comparison of CLA, CAAP & MAPP.” Presentation to CSU Joint Taskforce on the Collegiate Learning Assessment, March 17, 2009.
David Shulenburger, George L. Meharry, and Christine Keller, "The Voluntary System: Responding to a New Era," Liberal Education (Fall 2008), 48-53.
"A Test of Leadership: Charting the Future of U.S. Higher Education," A Report of the Commission Appointed by Secretary of Education Margaret Spellings, September 2006; url: http://www.ed.gov/about/bdscomm/list/hiedfuture/reports/final-report.pdf, accessed March 27, 2009.
“Learning to Reason and Communicate in College: Initial Report of Findings from the CLA Longitudinal Study,” Report of the Social Science Research Council, n.d.
No Author, “2007-2008 CSU Administration of CLA: Campus Practices.”
“Voluntary System of Accountability (VSA): Information on Learning Outcomes Measures,” Excerpts from Working Papers of Learning Outcomes Technical Work Group, updated by Presidential Advisory Committee, (January 23, 2008), 1-8.
Websites
Council for Aid to Education: Collegiate Learning Assessment http://www.cae.org/content/pro_collegiate.htm
ETS: MAPP – Measure of Academic Proficiency and Progress http://www.ets.org/portal/site/ets/menuitem.1488512ecfd5b8849a77b13bc3921509/?vgnextoid=ff3aaf5e44df4010VgnVCM10000022f95190RCRD&vgnextchannel=f98546f1674f4010VgnVCM10000022f95190RCRD
ACT: CAAP – Collegiate Assessment of Academic Proficiency http://www.act.org/caap/
The taskforce developed the following statement about its charge:
“Given that the PCA has committed to participate in VSA, one requirement of this commitment is the measurement of certain learning outcomes. These outcomes are expected to be general and common across the system and represent a common set of skills that represent part of the learning outcomes of a CSU education.* The task force is charged with evaluating the efficacy of the CLA, in comparison to the two other accepted measurement instruments in measuring these skills. *All of these skills are part of a subset of skills, approved by the ASCSU as part of the General Education program and are the building blocks of academic work throughout CSU curricula.”