BizEd

March/April 2010

Issue link: https://www.e-digitaleditions.com/i/56065


The MFT

Pros: Easy to use, measures overall student performance
Cons: Doesn't offer comparable data, doesn't identify individual areas of weakness

The MFT is designed to assess the knowledge of business students in core business areas such as accounting, economics, finance, international business, and business law. But while it helps faculty identify functional areas where students need more curricular development, the MFT also has three significant limitations.

First, the MFT is "normed"—that is, it compares students' scores only to those of other test takers in that semester. For that reason, we cannot compare how well students performed this semester to how well they performed last semester. Second, the MFT reports scores in terms of percentiles, not in terms of questions answered correctly. For example, if students are in the 90th percentile, it does not mean that they answered 90 percent of the questions correctly, only that they performed better than 90 percent of that round's group of test takers. Finally, the MFT does not report the functional-area-specific scores for each student, so it's impossible to analyze their strengths and weaknesses in these areas.

The Homegrown Test

Pros: Makes comparisons possible, tests performance in any core concept
Cons: Expensive to implement, time-consuming

A homegrown core concept test can overcome many of the limitations of the MFT. For example, the test that faculty designed for IUN is not normed, and it provides scores for each student in each functional area. For this reason, longitudinal comparisons of student scores can measure the reliability of the test itself. That is, if our faculty implement an intervention strategy in a functional area in a semester—say, for instance, they add a new simulation game to the core course in finance—they can measure its effectiveness by comparing the scores before and after the intervention.

Also, IUN's test is flexible enough to test students on any core area. For instance, the MFT does not test students in ethics and operations management, two core areas we identify in our list of learning goals, but our own test does.

Despite the advantages of a homegrown test, schools may encounter faculty resistance to implementing one because it requires such a huge time commitment. Our faculty spend considerable time writing, editing, refining, piloting, and revising test questions. To win faculty over, administrators must clearly communicate how important the test is to the assurance of learning process. It helps to identify individuals who will be responsible for handling and maintaining the data, and even to appoint an "assessment captain" to champion the assessment process. It also helps if schools encourage faculty who contribute to the design and implementation of the test by providing more opportunities for development.
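To make the percentile point above concrete: a normed percentile rank and a percent-correct score answer different questions. The short Python sketch below uses hypothetical numbers only (not MFT or IUN data) to show how a student who answers 60 percent of the questions correctly can still land in the 90th percentile of a low-scoring cohort.

def percent_correct(num_correct, num_questions):
    # Criterion-referenced view: share of questions answered correctly.
    return 100.0 * num_correct / num_questions

def percentile_rank(score, cohort_scores):
    # Normed view: share of the cohort this student outscored.
    below = sum(1 for s in cohort_scores if s < score)
    return 100.0 * below / len(cohort_scores)

# Hypothetical cohort of ten test takers (their percent-correct scores).
cohort = [41, 44, 46, 48, 50, 52, 53, 55, 57, 60]

# A student who answers 60 of 100 questions correctly...
print(percent_correct(60, 100))     # 60.0 -- got only 60 percent of questions right
print(percentile_rank(60, cohort))  # 90.0 -- yet outscored 90 percent of this cohort

The reverse can also happen: in a strong cohort, a high percent-correct score can map to a modest percentile, which is why normed scores alone cannot track a program's progress over time.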
IMPLEMENTING THE IN-HOUSE TEST

Date of test administration | Average percent of correct answers | Percentage of course grade in the capstone course | Type of pre-test intervention
Fall 2007 | 51.6 | 0 | No pre-test intervention
Summer 2008 | 50.7 | 0 | Dean speaks with students about test's importance
Spring 2009 | 54.4 | 10 | Instructor talks to students about the test, and sample questions are made available
Summer 2009 | 59.3 | 20 | Instructor talks to students about the test, and sample questions are made available

Student performance on an in-house assessment test improved after faculty at Indiana University Northwest's School of Business and Economics implemented interventions to better inform students about and prepare them for the test.
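As a rough sketch of the longitudinal comparison the homegrown test makes possible, the Python snippet below takes the average percent-correct figures from the table above and reports the change from one administration to the next. The semester labels and averages come from the table; the variable names and layout are only illustrative.

# Average percent correct by administration, from the table above.
results = {
    "Fall 2007":   51.6,  # no pre-test intervention
    "Summer 2008": 50.7,  # dean speaks with students about the test's importance
    "Spring 2009": 54.4,  # instructor talks to students; sample questions available
    "Summer 2009": 59.3,  # instructor talks to students; sample questions available
}

# Because the in-house test is not normed, scores from different semesters can
# be compared directly; each change is expressed in percentage points.
semesters = list(results)
for earlier, later in zip(semesters, semesters[1:]):
    change = results[later] - results[earlier]
    print(f"{earlier} -> {later}: {change:+.1f} points")

# Output:
# Fall 2007 -> Summer 2008: -0.9 points
# Summer 2008 -> Spring 2009: +3.7 points
# Spring 2009 -> Summer 2009: +4.9 points

The same before-and-after arithmetic underlies the simulation-game example in the text: an intervention's effect shows up as the change in a functional area's average score across administrations.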
