I would like to include ungraded beta questions in student assessments, for the purpose of collecting "hardness" statistics on these questions for future releases of the exam. I've already created a Question Set of betas from which to randomly draw, and have assigned them a point value of zero in the assessment. But to use beta questions properly, there are three features in BB that need to be configured, none of which appear to be configurable.
#1: The questions should be randomly mixed in with the graded questions. I've posted this issue on another discussion thread, regarding mixing all of the questions from multiple question sets. (For example, randomly draw five questions from each of eight question sets, then mix these 40 questions for random presentation.)
#2: I would like to turn off the display of the point value of each question in the student's view. The assessment currently shows the point value of each question, so it is trivial for the student to see which are ungraded. That will bias the analysis of the beta questions.
#3: I can find no way to see the students' results on individual questions. They can see how they did, question by question (if scores are released to them), but the instructor cannot. I get two email messages: one has the overall score, and the other lists the questions along with the student's answers, but it does not say whether each answer was correct. These are all multiple-choice questions, so it should be easy to produce. It appears that I will have to build my own parser to aggregate the individual question statistics. It would be preferable to be able to modify the results that get sent to the instructor, much like the control we have over the results that the students see.
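In case it helps anyone facing the same problem, here is a minimal sketch of the kind of parser I mean. It assumes the (question, answer) pairs have already been extracted from the result emails; the answer key and the field names are hypothetical, not anything Blackboard provides:

```python
from collections import Counter

# Hypothetical answer key: question ID -> correct choice letter.
ANSWER_KEY = {"Q1": "B", "Q2": "D"}

def tally_correct(rows):
    """Count (correct, attempts) per question.

    `rows` is an iterable of dicts with hypothetical keys
    'question' and 'answer', e.g. parsed from the result emails.
    """
    correct = Counter()
    attempts = Counter()
    for row in rows:
        qid = row["question"]
        attempts[qid] += 1
        if row["answer"] == ANSWER_KEY.get(qid):
            correct[qid] += 1
    return {q: (correct[q], attempts[q]) for q in attempts}

# Made-up data from two students:
rows = [
    {"question": "Q1", "answer": "B"},
    {"question": "Q2", "answer": "C"},
    {"question": "Q1", "answer": "A"},
    {"question": "Q2", "answer": "D"},
]
print(tally_correct(rows))  # {'Q1': (1, 2), 'Q2': (1, 2)}
```

The hard part, of course, is the extraction step, since the email format is not documented.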
If solutions to any of these have been posted already, please point me to the thread. I have searched deeply.
#1 - As you've seen from the other thread, you can't really mix up questions randomly drawn from multiple pools. The way to randomly select a small number of questions from a pool is to use a random block, and all the questions from a random block are presented together. Within the block the questions are randomized, and the order of the blocks can be randomized, but questions from different blocks can't be interspersed.
#2 - You have some control over the feedback that you give students. You can choose to show them their overall score and nothing else, in which case they will not see the points awarded for each question. But you can't show them the submitted answers, the correct answers, or feedback without them seeing how many points they were awarded for each question.
#3 - Don't rely on the email notifications for a detailed breakdown of student attempts. You need to go to the Grade Center and look at "Attempts Statistics" for the column to get a breakdown, for each question, of how many students chose each answer. However, you can only do that if all the students are answering the same questions, so random blocks that select subsets from a question pool are out. If you want to do more detailed statistical analysis, you can choose "Download Results" to export the attempts.
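If the downloaded file is tab-delimited, the per-answer breakdown can be rebuilt offline with a few lines of scripting. This is only a sketch: the column names ('Question ID', 'Answer') are guesses about the export format, so check them against what your version actually produces:

```python
import csv
from collections import Counter, defaultdict
from io import StringIO

def answer_distribution(tsv_text):
    """Count how many students chose each answer, per question.

    The column names 'Question ID' and 'Answer' are assumptions
    about the export layout -- adjust to match your download.
    """
    counts = defaultdict(Counter)
    for row in csv.DictReader(StringIO(tsv_text), delimiter="\t"):
        counts[row["Question ID"]][row["Answer"]] += 1
    return counts

# Made-up export with three students' answers to two questions:
tsv = (
    "Question ID\tAnswer\n"
    "Q1\tA\nQ1\tB\nQ1\tA\n"
    "Q2\tC\nQ2\tC\nQ2\tD\n"
)
dist = answer_distribution(tsv)
print(dict(dist))
```

A breakdown like this works even with random blocks, since every attempt row carries its own question ID, so each question's sample is simply smaller rather than unusable.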
#1: I was afraid of that - it is what I've seen in other discussions. I was hoping that there is a configuration setting outside of the user menu system, like a parameter to change at compile time. A hidden "feature" if you will.
#2: Let me clarify. I don't want the point value of the question displayed DURING the assessment. In fact, for this particular exam, the student never sees the results. It is pass/fail.
#3: Thanks for the suggestion. I'll look into that. Why did you say to not "rely" on email? Do things go missing?
© Blackboard, Inc.