Purpose
The purpose of this node is to review the usability, pedagogy and architecture-related issues associated with objective item assessment iDevices in eXe (in this case, the Multi-choice and True-False iDevices).
Problems
- The True-False iDevice is not intuitive.
- Feedback for correct and incorrect responses cannot be distinguished in the T-F iDevice.
- The pedagogy of these iDevices can be improved with regard to instructions for the question (e.g. "Indicate which of the following statements are true").
- Inefficient code: we currently maintain two independent iDevice code engines for the MCQ and T-F iDevices.
- Reuse and options for reconfiguring existing items are limited.
- iDevice pane could become cluttered and confusing if the number of question types increases.
Questions
- Can the generic code engine of the MCQ be refined and adapted to handle the MCQ format, the T-F format and possibly other types of objective item assessment?
- Can one published instance of an MCQ or T-F iDevice contain more than one question?
- Can one published instance of an objective item assessment iDevice contain more than one question type? (For example, an MCQ question followed by a True-False question in the same published instance of the question "container" concerned.)
- Should the Cloze iDevice be considered as a question sub-type?
- Can elements of a published iDevice in a package be reconfigured for publishing elsewhere in the package? (For example, can MCQs used during the teaching phase be reconfigured for a quiz at the end of the lesson? During the teaching phase, hints, feedback and multiple attempts are permitted, whereas in the quiz situation no hints or feedback are provided and the number of attempts is limited.)
- Should eXe be able to import/export questions from one package into another package? Does this imply an XML file format?
- While eXe has not specified QTI compliance for Release 1.0, should our architecture be designed in a way that will facilitate QTI compliance in the future?
- Should eXe include a randomisation feature in its code engine for objective items? Many quiz editors can randomise the published sequence in which the alternatives are displayed. At first glance this may not appear to be a worthwhile feature because of our standalone HTML export. However, it could be useful, for example, for multiple exports from the same .elp (so that last year's students don't get the same order as this year's students), or when an item used for teaching purposes is reconfigured for a quiz in the same package. (A minimal sketch of how this might work follows this list.)
- What other types of objective item assessment could be achieved using the MCQ code engine? For example, can it be used for matching lists?
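The randomisation question above could be answered with very little code. The sketch below, in Python (eXe's implementation language), shows one way a per-export seed could keep the shuffled order stable within a single export while letting successive exports differ. The function and parameter names are illustrative assumptions, not part of the current eXe code base.

{{{#!python
import random

def shuffle_options(options, export_seed):
    """Return a new list of options in a randomised order.

    Seeding with a per-export value keeps the order stable within one
    export while letting successive exports differ. Illustrative only;
    `options` and `export_seed` are assumptions, not existing eXe names.
    """
    rng = random.Random(export_seed)
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled

# Example: two exports of the same item produce different sequences.
options = ["Eggs", "Carrots", "Bacon", "Beans"]
print(shuffle_options(options, export_seed=2008))
print(shuffle_options(options, export_seed=2009))
}}}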
Anatomy of an objective item assessment iDevice in eXe
Within eXe, an objective assessment iDevice is characterised by distinctive components. These are:
- Pedagogy (Best practice associated with providing learners clear instructions as to what they should do, providing feedback on student responses and best practice on the formulation of good assessment items.)
- Technical (What is the technical definition of an iDevice? Can iDevices contain or inherit the code of other iDevices?)
- Presentation (Layout and presentation of the editing interface and the student's published view. This includes the student's view for different contexts as well as the published view for different formats - for instance Web, PDA, print, etc. It also includes publishing more than one question under one instance of the MCQ or True-False icon).
The difficulty from an architectural design perspective is that these components are not discrete - they are interrelated in unique ways depending on the specific use context. (The interrelationships between the components of an MCQ in a quiz situation are different from those of an MCQ used in the teaching material - hence the complexity associated with the notion of what a container is.)
Consequently, a use case analysis is helpful in specifying the ideal technical behaviour.
Use case analysis for a MCQ
What is the use case difference between an MCQ used for teaching and an MCQ used in a quiz?
Use case | Pedagogy and display issues | Implications for print version |
Teaching (formative assessment) | 1. Display instructions. 2. Provide an optional hint. 3. Provide feedback on correct and incorrect answers. 4. Permit multiple attempts at the same item. | 1. Display instructions in situ. 2. Publish hints elsewhere (e.g. back of publication). 3. Publish feedback elsewhere. |
Quiz (summative assessment) | 1. Display instructions. 2. Do not show the hint. 3. Do not provide feedback on correct and incorrect answers. 4. Specify the number of permissible attempts. 5. Allow changes to answers before pressing the submit button. 6. Track the score. | 1. Display instructions in situ. 2. Publish hints elsewhere (e.g. back of publication). 3. Publish feedback elsewhere. |
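The two rows of this table differ only in a handful of behavioural switches, which suggests both use cases could share one code engine parameterised by a small configuration object. The following Python sketch is a hedged illustration of that idea; the class and attribute names are assumptions and do not correspond to existing eXe code.

{{{#!python
class QuestionContext:
    """Behavioural switches distinguishing a teaching item from a quiz item.

    Illustrative only; these names are not part of the current eXe code.
    """

    def __init__(self, show_hint, show_feedback, max_attempts,
                 require_submit, track_score):
        self.show_hint = show_hint            # display the optional hint?
        self.show_feedback = show_feedback    # show feedback per response?
        self.max_attempts = max_attempts      # None means unlimited attempts
        self.require_submit = require_submit  # answers editable until Submit
        self.track_score = track_score        # record a score for the learner

# The same question content rendered under two contexts.
TEACHING = QuestionContext(True, True, None, False, False)
QUIZ = QuestionContext(False, False, 3, True, True)
}}}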
Issues
The following questions may help us understand the fundamental difference between a quiz and an item used for teaching purposes, and hopefully the difference between an iDevice and a container.
Question | Tentative answers |
Can an MCQ authored for a teaching context be reconfigured for use in a quiz? | Yes, this should be possible. It would have implications for how quiz items are selected or generated dynamically as a quiz at the end of a section, unit, etc. |
Can authors generate quiz items independently? | Yes, authors should be able to generate new items for a quiz. |
Can a quiz item be reconfigured for use in the teaching text? | Yes, this should be possible. However, we must think carefully about how we manage the implementation of the additional requirements of an objective item used in a teaching context (the need for hints and feedback, which may not be specified in a quiz item). |
Can a quiz contain different types of item in the same quiz? | Yes, MCQs, T-F items and other types of items should be able to be included in a quiz. |
Can a "question" iDevice used for teaching purposes contain different types of objective assessment items? | Yes, this should be possible. |
Tentative conclusions
- A container is something that "contains" one or more discrete reusable chunks of code - for example the functions of a single MCQ question or T-F item.
- Containers are differentiated from each other by the generic behaviours and requirements of a particular use case context. For example, the "Quiz" container will allow users to change their answers to a set of questions until pressing the "submit" button. However, the Question container will provide immediate feedback without the requirement for a submit button.
- In most cases, users will not be concerned with the distinction between iDevices and containers. Consequently it will be OK to include a container in the iDevice pane, although at an architectural level iDevices and containers are very different.
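Read in code terms, these conclusions suggest that the reusable "chunk" is a question element and that containers differ only in how they drive those elements. The Python sketch below illustrates that reading; none of these classes exist in eXe today and the names are purely illustrative assumptions.

{{{#!python
class QuestionElement:
    """A single reusable item (MCQ, T-F, etc.): prompt, options, feedback."""

    def __init__(self, prompt, options, correct_index,
                 feedback_correct="Correct", feedback_incorrect="Incorrect"):
        self.prompt = prompt
        self.options = options
        self.correct_index = correct_index
        self.feedback_correct = feedback_correct
        self.feedback_incorrect = feedback_incorrect

    def check(self, chosen_index):
        return chosen_index == self.correct_index


class QuestionContainer:
    """Teaching context: feedback is returned immediately, no submit button."""

    def __init__(self, elements):
        self.elements = elements

    def answer(self, element_index, chosen_index):
        element = self.elements[element_index]
        return (element.feedback_correct if element.check(chosen_index)
                else element.feedback_incorrect)


class QuizContainer:
    """Quiz context: answers accumulate and are only scored on submit()."""

    def __init__(self, elements):
        self.elements = elements
        self.answers = {}

    def answer(self, element_index, chosen_index):
        self.answers[element_index] = chosen_index   # may be changed freely

    def submit(self):
        return sum(1 for i, e in enumerate(self.elements)
                   if e.check(self.answers.get(i)))
}}}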
Thoughts on requirements for containers and refinements to existing MCQ, T-F and SCORM Quiz iDevices
- Replace the current MCQ and T-F iDevice with a generic Question container that is listed in the iDevice pane.
- Display the new container as a Question "iDevice" in the iDevice pane - i.e. don't have separate iDevices for MCQ, T-F etc.
- Users should have the ability to include one or more questions within the Question container.
- A list of questions is published as a single "iDevice" i.e. with one icon followed by the list of questions.
- Users should have the ability to include more than one type of question within the Question "iDevice" container.
- We should explore the use of a separate editing window for different types of questions, similar to the iDevice editor window. For example a different editing window for MCQs and T-F items.
- If Quiz items already exist within the current package, provide the ability to incorporate them within the Question container; users will be required to add hints, feedback, etc.
- Consider the ability to import/export question items between different eXe packages in Release 1.0, without excluding the possibility of importing/exporting question items from other authoring tools after Release 1.0.
- Consider the implementation of an XML file format (a minimal sketch follows this list).
- Reconfigure the MCQ code engine for the T-F questions.
- Include an optional random sequencing capability for each time a package is exported. This would also be valuable when items are reused in different containers.
- Refine the existing SCORM Quiz iDevice to become a Quiz container that appears in the iDevice pane.
- Display the new container as a Quiz "iDevice" in the iDevice pane.
- Requirements are the same as for the Question container above, except for the differences specified below:
- Hints and feedback do not need to be included when creating questions in the Quiz container; in the print version the correct option must be indicated.
- The number of permissible attempts before pressing the submit button must be specified.
- In the published version a submit button must be provided before the score is calculated.
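To make the import/export and XML-format items above more concrete, here is a minimal sketch of what a package-to-package interchange format might look like, using only Python's standard library. The <question> and <option> element names are invented for illustration and are not a proposed standard; a real format would more likely follow the IMS QTI vocabulary discussed elsewhere on this page.

{{{#!python
import xml.etree.ElementTree as ET

def question_to_xml(prompt, options, correct_index):
    """Serialise one objective item to a minimal XML fragment.

    Element names (<question>, <option>) are illustrative assumptions only.
    """
    question = ET.Element("question", type="multichoice")
    ET.SubElement(question, "prompt").text = prompt
    for i, text in enumerate(options):
        option = ET.SubElement(question, "option",
                               correct=str(i == correct_index).lower())
        option.text = text
    return ET.tostring(question, encoding="unicode")

print(question_to_xml("2 + 2 = 4", ["True", "False"], 0))
}}}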
Use case study on the implications of using the MCQ code engine for the T-F question type
It should not be too difficult to adapt the MCQ code engine for the T-F item question type. I used the Multi-choice editor to create the following T-F item, thus demonstrating that the MCQ code engine can function rather well for T-F items, with minor adaptations. I changed the label from Multi-Choice to True-false question. I used the text area for the instructions and statement of the question. Naturally I had to type in "True" and "False" strings as options, but now had the capability of providing feedback for both the correct and incorrect alternatives. This is currently not possible with the T-F iDevice.
inline:T-F_usingMCQeditor.jpg
Issues
- If a generic question container is implemented, we no longer require a label indicating the question type. This is self-evident from the question itself.
- While it is possible to add pedagogically appropriate instructions such as "Indicate which of the following statements are true" in the text area box, we should consider implementing this as default text that users can edit.
- Publishing layout will need to be adapted for a true-false iteration when using the MCQ code engine.
- We should consider a user-specified option for different variants of the true-false string, for example: Correct or Incorrect; Agree or Disagree.
Requirements for refining the MCQ code engine for a T-F question become more evident when considering the editing view of the MCQ item for T-F questions:
inline:T-F_usingMCQeditor_editview.jpg
Anticipated refinements
- The string "Indicate whether the following statement(s) are true or false" should be provided as default text in the text area window used for instructions.
- We should include the ability to add more than one true-false statement, and think about the following implementation issues:
- Optional auto-numbering of questions
- How to deal with the iDevice label in singular versus plural situations
- How do we manage the default text relating to pedagogical instructions in the singular versus plural situation
- Incorporating the feature for variants of the true-false string (Correct-Incorrect and Agree-Disagree), as well as thinking about corresponding changes to the default pedagogical instructions (a small lookup-table sketch follows this list).
- Regarding alternatives, the text area should be replaced with a radio button implementation of the selected variant (true-false, correct-incorrect or agree-disagree).
- Retain the radio button for correct answer.
- Provide a text area for feedback on correct and incorrect responses.
- If possible, display the default feedback of "correct" or "incorrect" in the text area box once the correct answer has been selected. Users should be able to edit the default feedback text.
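The variant strings and the singular/plural default text discussed in this list could be driven by a small lookup table. The sketch below is one hedged possibility; the dictionary keys, wording and function names are assumptions, not existing eXe behaviour.

{{{#!python
# Variant strings and matching default instructions; all names illustrative.
TF_VARIANTS = {
    "true-false":        ("True", "False"),
    "correct-incorrect": ("Correct", "Incorrect"),
    "agree-disagree":    ("Agree", "Disagree"),
}

DEFAULT_INSTRUCTIONS = {
    "true-false": (
        "Indicate whether the following statement is true or false.",
        "Indicate whether the following statements are true or false."),
    "correct-incorrect": (
        "Indicate whether the following statement is correct or incorrect.",
        "Indicate whether the following statements are correct or incorrect."),
    "agree-disagree": (
        "Indicate whether you agree or disagree with the following statement.",
        "Indicate whether you agree or disagree with the following statements."),
}

def radio_labels(variant):
    """Labels for the answer radio buttons under the chosen variant."""
    return TF_VARIANTS[variant]

def default_instruction(variant, statement_count):
    """Default pedagogical instruction, adjusted for singular versus plural."""
    singular, plural = DEFAULT_INSTRUCTIONS[variant]
    return singular if statement_count == 1 else plural

print(radio_labels("agree-disagree"))        # ('Agree', 'Disagree')
print(default_instruction("true-false", 3))  # plural wording
}}}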
Other question types that the MCQ could manage
These could include:
- True-False, see discussion above.
- Multiple-response item (where the correct answer includes more than one response; for example: Which of the following food types are associated with an English breakfast? (1) Eggs (2) Carrots (3) Bacon (4) Beans (5) Ice cream). A scoring sketch follows this list.
- Ordering question, where users are requested to specify the correct sequence of a list of options.
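To show why the MCQ code engine could absorb the multiple-response case with modest changes, here is a hedged scoring sketch. The checkbox rendering is glossed over and the function name is an illustrative assumption.

{{{#!python
def check_multiple_response(selected, correct):
    """Return True only if the learner ticked exactly the correct options.

    `selected` and `correct` are sets of option indices; a single-answer
    MCQ is just the special case where `correct` has one element.
    """
    return set(selected) == set(correct)

# English-breakfast example from above: Eggs (0), Bacon (2) and Beans (3).
print(check_multiple_response({0, 2, 3}, {0, 2, 3}))  # True
print(check_multiple_response({0, 1}, {0, 2, 3}))     # False
}}}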
Additional question types that may require additional code engines
- Drag-and-drop matching list (could use a generic Flash container and parse XML from eXe into the swf container).
- Cloze-type fill-in-the-box used in conjunction with an image map iDevice (useful for labelling diagrams, etc.).
- Hybrid Cloze question that uses a pull-down menu for missing words.
Resources to consult
- IMS QTI specification see overview document - http://www.imsglobal.org/question/qti_v2p0/imsqti_implv2p0.html
Attachments (2)
- T-F_usingMCQeditor.jpg (37.6 KB) - 2009-05-22.
- T-F_usingMCQeditor_editview.jpg (81.5 KB) - 2009-05-22.