Academic Exchange Quarterly Spring 2012 ISSN 1096-1453 Volume 16, Issue 1

To cite, use the print source rather than this on-line version, which may not reflect print copy format requirements or text layout and pagination.

 

This article should not be reprinted for inclusion in any publication for sale without the author's explicit permission. Anyone may view, reproduce, or store a copy of this article for personal, non-commercial use as allowed by the "Fair Use" limitations (sections 107 and 108) of the U.S. Copyright law. For any other use and for reprints, contact the article's author(s), who may impose a usage fee. See also the electronic version copyright clearance.

CURRENT VERSION COPYRIGHT © MMXII AUTHOR & ACADEMIC EXCHANGE QUARTERLY

 

An Assessment Plan to Study Student Learning

 

Carla Beeber, CUNY Kingsborough Community College, NY

Loretta Brancaccio-Taras, CUNY Kingsborough Community College, NY

 

Carla Beeber, Ed.D. is an Associate Professor and Loretta Brancaccio-Taras, Ph.D. is a Professor in the Dept. of Biological Sciences.

 

Abstract

This paper describes the process by which a departmental assessment plan was designed and implemented. The data gathered and analyzed from our first-semester General Biology course are presented. Four outcomes, microscope usage, writing a lab report, graphing, and bibliography development, were assessed. Assessment data analysis was used to modify the course to improve student performance. Statistically significant improvements were found for the microscope usage and laboratory report writing outcomes after instituting course modifications.

 

Introduction

Over the past several years, educators have begun to examine more effective teaching methods in order to improve student performance. Critical thinking and problem-based learning have been adopted in many schools across the country. Unfortunately, when evidence of learning is evaluated, these innovations appear to have had little effect on student learning (Jones 2002). As a result of these findings, a call has been made for more extensive assessment of what students are learning and how they learn.

 

Standardized tests, such as the Collegiate Learning Assessment (CLA) and the National Survey of Student Engagement (NSSE), were instituted in higher education to assess student learning as well as to make institutions of higher learning accountable for student performance (McCormick 2007). The Spellings Commission recommended national standardized tests; the idea was rejected by colleges and universities. A study of standardized tests at the University of Washington concluded that standardized tests neither measure students' learning nor help faculty improve instruction (Hoffman Beyer & Gillmore 2007). Banta (2007) reported that there is little information available about the reliability and validity of standardized tests. The question remains whether these exams truly measure students' ability to solve problems, think critically, demonstrate analytical reasoning skills, or display disciplinary knowledge (Shulman 2007).

 

The goal of establishing effective assessment methods can be complex and challenging. There is a need to move away from traditional tests, which assess pure content knowledge, and to establish learning outcomes that examine the skills students require to achieve lifelong learning. One of the main problems encountered in establishing assessment practices is the need for faculty to reexamine classroom practices. The most successful assessment practices result from faculty collaborating and supporting one another while administrators provide professional development (Sato & Atkin 2006/2007). The assessment tools created need to be aligned with the content and skills students are expected to master in a particular course or academic program (Martone & Sireci 2009). For example, the Biology Department at James Madison University developed a multiple-choice instrument, Natural World-9, to determine the ability of biology majors to evaluate evidence and analyze data (Hurney et al. 2011). Ekman and Pelletier (2008) stress the significance of colleges and universities banding together to use instruments such as the NSSE and CLA, and report the lessons learned from these efforts: involving faculty, using multiple measures, and sharing findings. Arum and Roksa (2011) present data on the lack of learning on college campuses as measured by the CLA. Ultimately, the most important part of the assessment process is using the collected data to make informed revisions in courses and programs (Reed, Levin, & Malandra 2011).

 

This paper describes the development, implementation, and results of an assessment plan initiated by the Department of Biological Sciences at Kingsborough Community College (KCC). Changes to the curriculum resulting from assessment data analysis, as well as faculty perceptions of the assessment process, are also described.

 

Assessment Process

KCC, one of the seventeen units of the City University of New York, is an urban community college in Brooklyn, New York. The Department of Biological Sciences has twenty-three full-time faculty and serves students interested in pursuing an Associate in Science (A.S.) degree in Biology. In 2002, KCC made a commitment to develop assessment plans throughout the college. As part of this commitment, faculty from various departments were sent to the American Association for Higher Education Assessment Conference. Two members of the Department of Biological Sciences attended the conference and then formed the department's assessment committee along with four faculty volunteers.

 

The first task the departmental committee assumed was the development of a resource guide consisting of information about assessment terms, techniques, plans, and reference articles. The completed document was distributed to all department members in preparation for the development of a departmental assessment plan in November 2002. At several department meetings, the material was discussed and faculty began to establish student learning outcomes for all courses in the department. The departmental assessment committee also devised a timeline for developing assessment plans at the course, departmental, and program levels.

 

In order to assess the department's programs, two committees were formed in April 2003. One committee focused on the assessment of students in the allied health programs, which include students pursuing an A.S. in Biology with a concentration in occupational therapy, physician assistant, or pharmacy. The second committee focused on the assessment of students pursuing an A.S. in Biology with no concentration in a health-related field. These students, for the sake of the assessment committees, were considered Biology majors.

 

The Biology Majors Program Assessment Committee, consisting of nine faculty members, had its first meeting in May 2003 to develop program goals. Based on these goals, measurable activities were selected from the two-semester General Biology course sequence (Bio 13 & 14) and the Biology elective courses (Comparative Anatomy, Developmental Biology, General Microbiology, Marine Biology, Ecology, Invertebrate Zoology, Botany, and Genetics). These activities were presented to the department as outcome maps in October 2003. For each activity, an assessment tool was developed and a passing score for the assessment tool was set. The committee members reviewed the assessment tools and agreed to use these tools to assess student learning. Collection of assessment data began during the spring 2004 semester.

 

The Allied Health Program Assessment Committee, consisting of ten faculty members, met in May 2003 to develop program goals. Based on these goals, measurable activities were selected from the two-semester Human Anatomy & Physiology course sequence and the Microbiology in Health and Disease course. These activities were presented to the department as outcome maps in March 2004. For each activity, an assessment tool was developed. The committee members reviewed the assessment tools and agreed to use these tools to assess student learning. Collection of assessment data began in September 2004.

 

Developing assessment tools was a collegial effort. Although difficult at times, the comments and concerns of all faculty were taken into account so that implementation of the assessment tools would be successful. After a series of meetings of the Biology Majors Assessment Committee, four skills were selected to be assessed: ability to focus a microscope; writing a laboratory report; construction and analysis of graphs; and writing a bibliography. From spring 2004 to the present, all faculty members have been expected to use the assessment tools provided by the course coordinators. Course coordinators also gave recommendations for the week of the semester in which each assessment should be carried out.

 

At the semester's completion, instructors submit their assessment data to the course coordinator. These data include the number of students enrolled in the course, the number of students who took the assessment, the number of students who passed the assessment, and the week the assessment was administered. The course coordinator pools the data from each section and determines the percent pass rate for each assessment across all course sections.
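
As a minimal illustration of this pooling step (a Python sketch, not the department's actual tool), the calculation might look as follows; all per-section figures below are hypothetical placeholders:

# Pool per-section assessment data into an overall percent pass rate.
# Each record mirrors what instructors report: enrollment, number of
# students assessed, number who passed, and the week administered.
# All numbers here are hypothetical.
sections = [
    {"enrolled": 24, "assessed": 22, "passed": 15, "week": 5},
    {"enrolled": 25, "assessed": 23, "passed": 17, "week": 5},
    {"enrolled": 23, "assessed": 20, "passed": 14, "week": 6},
]

total_assessed = sum(s["assessed"] for s in sections)
total_passed = sum(s["passed"] for s in sections)
print(f"Percent pass rate: {100 * total_passed / total_assessed:.1f}% "
      f"({total_passed}/{total_assessed})")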

 

 

Results

Although assessment data are being collected from all courses in the Dept. of Biological Sciences, the focus of this paper is on the first-semester course of the one-year General Biology sequence. For the use of the microscope, students needed a score of 85% to pass; 66.9% of the students passed this assessment. For writing a laboratory report, passing was a grade of 75% or better; 66% of the students passed this assessment. Another skill measured was the ability to develop a bibliography; for this assessment, 69.8% of the students received a grade of 85% or better. Finally, students' ability to construct and analyze graphs was examined; 73.1% of the students received a score of at least 75% on this assessment.

 

In May 2006, the Biology Majors Assessment Committee met to review the aforementioned assessment data. Based on the data and faculty perceptions, changes were made in Bio 13 in an attempt to enhance student learning. To improve students' ability to use the microscope, more microscope activities were incorporated into Bio 13, and these activities were uniformly spaced throughout the semester.

In order to improve students' ability to write a laboratory report, the committee recommended that instructors scaffold the laboratory report so that students would receive feedback and have the opportunity to revise their reports prior to final submission. The committee also decided to include more graphing activities, as well as analysis of graphed data, throughout the semester in Bio 13. Finally, to address students' proficiency in preparing a bibliography, a handout on developing a bibliography, including proper citation formats, was distributed to students in all sections of Bio 13. These course modifications were initiated in the fall 2006 semester.

 

Following these course modifications, Bio 13 assessment data were collected over five semesters, fall 2006 to fall 2008. For the ability to use a microscope, the percent passing was 80.8%, compared to the 66.9% obtained for fall 2004 through spring 2006. For the graphing assessment, the percent passing after the course modification was 74.0%, compared to 73.1%. A slight increase, from 69.8% to 73.5%, was seen in students' ability to develop a bibliography. Students' ability to write a laboratory report improved from 66.0% to 83.1%.

 

Statistical analysis of the results was conducted using a chi-square test. The difference in the proportion of students passing before versus after the curriculum revisions was statistically significant for use of the microscope (chi-square = 25.06, df = 1, p < .0001) and for writing a laboratory report (chi-square = 37.902, df = 1, p < .0001), but not for the graphing outcome (chi-square = .0087, df = 1, p = .7685) or the development of a bibliography (chi-square = 1.582, df = 1, p = .2085).
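
For readers who wish to reproduce this kind of comparison, a minimal Python sketch of the 2x2 chi-square test of independence is shown below. The counts are hypothetical placeholders chosen to roughly match the reported microscope pass rates (66.9% before, 80.8% after), since the paper reports percentages rather than raw student counts.

# Chi-square test of independence on a 2x2 table of pass/fail counts
# before and after the curriculum revisions (df = 1).
# All counts are hypothetical; substitute the actual pooled totals.
from scipy.stats import chi2_contingency

observed = [[201, 99],   # before revisions: passed, failed (~66.9% of 300)
            [242, 58]]   # after revisions:  passed, failed (~80.8% of 300)

chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")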

 

Discussion

Based on the assessment data presented in this paper, the changes implemented in the first-semester General Biology course improved student learning for two of the four skills assessed. The greatest improvement was observed in the writing of laboratory reports. The scaffolding of the laboratory report over a series of weeks, with feedback provided to students multiple times, had a positive effect on the overall quality of these reports. In addition, it provided students with some insight into the recursive nature of the writing process. Slight increases in students' ability to develop a bibliography were also observed. Future plans include a session on the use of the library's databases and RefWorks led by one of the college's librarians. However, the course's additional microscope and graphing activities did not improve students' performance on these tasks. Further analysis of the assignments associated with these two skills, and of faculty classroom practice, is required.

 

Although the assessment data gathered did not lead to improved student learning for all measured skills, the data were extremely useful in our departmental assessment process. Developing a departmental assessment process, as well as collecting and analyzing student performance data, led to an open dialogue about teaching and learning. Over the years in which the assessment plan was developed and modified, faculty became less resistant to the process. This may be because the assessment plan was created by the faculty, without imposition by outside stakeholders. In addition, the process was a transparent one, with information supplied to faculty, students, and administrators. As has been the experience of other institutions that have worked on assessment plans, success requires that instructors view these plans as a continuous, integrated activity that critically analyzes student learning without bias, not as added busy work. Ultimately, the greatest benefit of assessment is achieved through a systematic review of student learning, not through its use as a method to evaluate faculty or institutional performance (Casey 2004; Peach, Mukherjee, & Hornyak 2007; Zadra 2007).

 

In our department, we implemented assessment measures whose results depicted the specific skills students developed as a result of the course's learning activities. It is evident that additional modifications to the assessment plan will be required. The results offered information regarding students' performance in the introductory biology course and provided the faculty with the opportunity to make curricular changes based on evidence of student learning. Since the assessment process is an ongoing endeavor, the use of consistent, summative assessment instruments administered to all students might present a clearer understanding of what students actually learn about biology. Smith (2007) recommends the use of both formative and summative assessment techniques and comments on the difficulty of using formative assessment as a predictor of success on summative assessment.

 

In addition to using a mixture of formative and summative assessment measures, our assessment process could be even more effective if both quantitative and qualitative methods were used (Hoffman Beyer & Gillmore 2007). For instance, questioning students through surveys, focus groups, or interviews can help faculty understand why students drop a course or change their major. This type of analysis allows instructors and administrators to view the complexity of the college experience from the student's point of view. Often, quantitative and qualitative assessment can deepen our understanding of what students are learning and of how they learn.

 

Quantitative and qualitative assessments should be performed at the college level as well as the departmental level so that the unique aspects of each discipline are retained. Students are affected differently by the learning involved in various disciplines. While a few students come to college with a clear vision of their future career plans, many need guidance in choosing a field that matches their skills. It is only when quantitative and qualitative data from all levels are linked that faculty and advisors can better guide students about their future goals and choices. With better advisement, educational settings will generate a society of capable individuals with the skills to solve the problems facing the world.

 

Conclusion

The assessment plan described in this paper resulted from the collaboration of faculty in a large department at an urban community college. The plan was faculty driven, but supported by the college's administration. The plan took several years to develop and will no doubt continue to change in the future. A significant part of the plan was the implementation of consistent assessment tools for four skills in the General Biology I course. As assessment data were gathered for all course sections, modifications were made to the activities in the course to help improve student success on the identified skills. This process of "closing the assessment loop" is probably the most significant part of any assessment plan. Statistical analysis of the collected data identified the areas still requiring further reform. In addition, an examination of student perceptions about their learning may provide insights that can help faculty develop strategies to improve students' success.

 

References

Arum, R. & Roksa, J. 2011. Academically Adrift: Limited Learning on College Campuses. The University of Chicago Press.

Banta, T.W. 2007. Can assessment for accountability complement assessment for improvement? Peer Review, Vol. 9, No. 2: 9-12.

Casey, K.M. 2004. Greater expectations: teaching and assessing for academic skills and knowledge in the general education history classroom. The History Teacher, Vol. 37, No. 2: 171-181.

Ekman, R. & Pelletier, S. 2008. Assessing student learning: a work in progress. Change, Vol. 40, No. 4: 14-19.

Hoffman Beyer, C. & Gillmore, G.M. 2007. Longitudinal assessment of student learning: simplistic measures aren't enough. Change, Vol. 39, No. 3: 43-47.

Hurney, C., Brown, J., Griscom, H.P., Kancler, E., Wigtil, C., & Sundre, D. 2011. Closing the loop: involving faculty in the assessment of scientific and quantitative reasoning skills of biology majors. Journal of College Science Teaching, Vol. 40, No. 11: 18-23.

Jones, E.A. 2002. Myths about assessing the impact of problem-based learning on students. The Journal of General Education, Vol. 51, No. 4: 326-334.

Martone, A. & Sireci, S.G. 2009. Evaluating alignment between curriculum, assessment, and instruction. Review of Educational Research, Vol. 79, No. 4: 1332-1361.

McCormick, A.C. 2007. First, do no harm. Carnegie Perspectives. http://www.carnegiefoundation.org/perspectives/sub.asp?key=245&subkey=2349

Peach, B.E., Mukherjee, A., & Hornyak, M. 2007. Assessing critical thinking: a college's journey and lessons learned. Journal of Education for Business, Vol. 82, No. 6: 313-320.

Reed, T.E., Levin, J., & Malandra, G.H. 2011. Closing the assessment loop by design. Change, Vol. 43, No. 5: 44-52.

Sato, M. & Atkin, J.M. 2006/2007. Supporting change in classroom assessment. Educational Leadership, Vol. 64, No. 4: 76-79.

Shulman, L. 2007. Counting and recounting: assessment and the quest for accountability. Change, Vol. 39, No. 1: 20-25.

Smith, G. 2007. How does student performance on formative assessments relate to learning assessed by exams? Journal of College Science Teaching, Vol. 36, No. 7: 28-34.

Zadra, P. 2007. Assessment: the long road. BizEd, Vol. 6, No. 6: 40-42.