BizEd

March/April 2008



"reward faculty for more than their research."
—Douglas Eder of the University of North Florida

computer architecture to support a company's new Web site. In the assignment, the company's marketing department knows the number of daily users for the site, but cannot provide detailed projections for site use. Students work independently to determine likely use patterns, choose architectural components to support that use, and justify their choices.

Before students begin the assignment, we tell them that it will be used for assessment. Students also are given the criteria that will be used to measure successful responses. Sharing the assessment process with our students has had unexpected advantages. Students tell us that they appreciate knowing what is expected of them, not only in IT proficiency, but also in assignments for critical thinking and written and oral communication. Better yet, they note that this consistent feedback has helped them improve.

Taming the Data

As we fine-tune STEPS, we have developed greater flexibility in the way we collect, store, and access the data we generate in these assignments. Students can go online and follow posted instructions to upload their completed assignments into the STEPS system. Once assignments are in the system, a coordinator uses STEPS to select a random sample of work and assign it to evaluators for double-blind evaluation. In turn, evaluators receive e-mail notification that the work is ready to evaluate. Evaluators can assess their assigned work remotely and enter their assessments into the STEPS software. At the same time, the assessment coordinator uses STEPS to monitor the evaluators' completion progress.

Concurrently, evaluators meet to calibrate their assessments and make sure they evaluate each student response using the same rubrics.
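The sampling and assignment step described above can be sketched in a few lines of Python. This is only an illustration of the general technique, not the actual STEPS software, whose internals the article does not describe; the function name, parameters, and the assumption that submissions are identified by anonymous IDs (with student names already stripped, as double-blind review requires) are all ours.

```python
import random

def assign_double_blind(submissions, evaluators, sample_size,
                        reviews_per_item=2, seed=None):
    """Select a random sample of submissions and distribute each sampled
    item to several evaluators, identified only by anonymous IDs.

    `submissions` maps an anonymous submission ID to the uploaded work;
    `reviews_per_item` controls how many evaluators see each item so
    their scores can later be calibrated against one another.
    """
    rng = random.Random(seed)
    sample = rng.sample(list(submissions), min(sample_size, len(submissions)))

    # evaluator -> list of submission IDs assigned to that evaluator
    assignments = {ev: [] for ev in evaluators}
    for sub_id in sample:
        # Pick distinct evaluators for this item; no evaluator
        # reviews the same submission twice.
        chosen = rng.sample(evaluators, min(reviews_per_item, len(evaluators)))
        for ev in chosen:
            assignments[ev].append(sub_id)
    return sample, assignments
```

A coordinator-style caller would then e-mail each evaluator their list of IDs and track which assignments have been scored.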
For example, before industry reviewers review the assignment, they first do a sample assessment so that they can address any questions about the application of our chosen criteria.

We originally conceived STEPS as an assessment tool for oral and written assignments. It has become much more, helping us collect more than 1,000 assignments and control the countless moving parts in the assessment process. Eventually, the system will house all of our assessment data.

Raising Awareness

In "Assessing Student Learning: Are Business Schools Making the Grade?," an article published recently in the Journal of Education for Business, Kathryn Martell notes that the weakest link in the assessment process is "closing the loop." That is, many schools gather and even carefully analyze the data, but they then fail to take the last step: making improvements based on their analyses.

To address this issue, Chico is revising its curriculum development to incorporate more direct input from the assessment process. For example, based on the analysis of the assessment data, our board for IT proficiency found that we needed to provide students with a better example of IT architecture. Its industry members offered to restructure the assignment given to students to assess their proficiency in applying current technology to a case study.

We also take care to look at the data from different angles. For example, if we were only to combine data from all students and all evaluators, it might hide weaknesses and strengths in subsets of the data. So, we have found it useful to compare students by criteria such as chosen specialization and transfer status. It was only by making such smaller comparisons that we found that accounting students were receiving lower assessments in written communication than MIS students, particularly in mechanics. This revelation came as a great surprise to the accounting faculty.
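The subgroup comparison described above amounts to grouping assessment records by a student attribute and averaging scores within each group. A minimal sketch, assuming records are simple dictionaries (the field names and scoring scale here are illustrative, not taken from STEPS):

```python
from collections import defaultdict
from statistics import mean

def mean_by_group(records, group_key, score_key):
    """Average assessment scores within each subgroup (e.g. chosen
    specialization or transfer status) so that an aggregate average
    across all students cannot hide a weak subset."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[group_key]].append(rec[score_key])
    return {group: mean(scores) for group, scores in groups.items()}
```

Running this once with `group_key="major"` and once with `group_key="transfer_status"` is exactly the kind of "smaller comparison" that surfaced the gap between accounting and MIS students' writing scores.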
With the hard data before them, they did not argue with the results; they focused on how to get more writing assignments added to the accounting classes to improve their students' skills.

To improve writing proficiency, the advisory board recommended changes in the number of assignments and a wider distribution of handouts that cover the expectations of great writing and the common errors to avoid. The college also hired a writing tutor. Since making these interventions in spring 2007, we have seen significant improvement in our students' sentence structure, paragraph structure, word choice, tone, and professional format. Ironically, at the same time we actually recorded decreases in their scores in grammar, punctuation, and spelling. But we view these discoveries as part of the process: we're still digesting these results and plan further interventions.

With STEPS, we've worked to make assessment as nonintrusive for our faculty as possible. In the process, more of our faculty have become aware of assessment and have become more interested in raising the quality of student learning in their courses. Such an environment only makes it easier to pursue better assessment methods, and greater learning outcomes for all of our students.

Gail Corbitt is a professor of accounting and MIS and the director of the Center for Information Systems Research at the College of Business at California State University, Chico; Steve Adams is a professor of accounting; and Ken Chapman is a professor of finance and marketing and the director of the school's Assurance of Learning Initiative.
