This is David Hiersekorn's Typepad Profile.
David Hiersekorn
Recent Activity
I don't agree with the methodology. They accept California as the hardest exam, which is a fair assumption. Then they compare each school's pass rates in other states to that school's pass rate in California. For example, if Harvard grads pass 76% of the time in California and 81% of the time in Arkansas, then Arkansas is 5 points "easier." If you apply that method across all schools, you get a "bonus" factor, and they used that bonus factor to determine the relative difficulty of each state's exam.

However, that method is really measuring each state's ability to attract the best students from top schools. The better a state does at attracting top talent, the "easier" it will rank under this methodology.

To illustrate, suppose that Harvard graduates 100 people, and that the top 50% of them take the New York exam while the bottom 50% take the Arkansas exam. The top 50 would do well in New York, even in spite of its difficulty; after all, the top 50% of Harvard grads are likely to pass ANY exam. On the other hand, if there are any "failers" in the Harvard batch, they will be in the bottom 50%. So those failers take the Arkansas exam and fail. Poof! Arkansas is harder than New York.

Adding California as a benchmark doesn't change the inherent flaw in the method; it only presents a more complex version of the same flaw. It turns this into a measure of how likely any given state is to attract candidates from a school who are better or worse than the candidates that same school sends to California. That methodology won't stand.

A better approach would be to use raw MBE scores and compare pass rates against them. You could, for example, compare the average MBE score among passers. I believe that would be a better and more accurate measure.
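The selection effect described above can be sketched in a few lines. This is a hypothetical illustration, not the study's actual method: assume 100 graduates whose ability equals their probability of passing, and assume both states give an identically difficult exam, so any gap in pass rates is pure self-selection.

```python
# Hypothetical: 100 graduates from one school, ranked by ability.
# Ability is taken to equal the probability of passing either exam;
# both exams are assumed identically difficult.
abilities = [i / 100 for i in range(100)]  # 0.00 (weakest) .. 0.99 (strongest)

top_half = abilities[50:]     # strongest grads sit the "New York" exam
bottom_half = abilities[:50]  # weakest grads sit the "Arkansas" exam

# Expected pass rate of each group is just its mean ability.
ny_rate = sum(top_half) / len(top_half)        # 0.745
ar_rate = sum(bottom_half) / len(bottom_half)  # 0.245

print(f"NY pass rate: {ny_rate:.3f}")  # 0.745
print(f"AR pass rate: {ar_rate:.3f}")  # 0.245
```

Even though the two exams are identical by construction, a pass-rate comparison makes New York look dramatically "easier," which is exactly the artifact the method can't distinguish from real exam difficulty.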
David Hiersekorn is now following The Typepad Team
Apr 5, 2013