The Discrete Option Multiple Choice™ question format (DOMC) may be one of the most important testing innovations our industry has experienced in several decades. Why would I make such a bold statement, particularly when there have been some wonderful innovations, such as computerized adaptive testing, the use of simulations in tests, new analyses to detect test fraud, and many others? Why might the DOMC™ item type rank among those? Here are a few good reasons:
These and other benefits are explored in the Advantages section below.
The Discrete Option Multiple Choice question type (DOMC) represents a relatively simple but very useful change in the delivery of multiple choice item content on computerized tests. Instead of providing all of the options to the test taker at one time, as is usually done, options are presented randomly, one at a time, each along with YES and NO buttons. Please see the example below. For each presented option, the test taker chooses YES or NO to indicate whether the option is perceived as correct or incorrect. Once the question has been answered correctly or incorrectly, no further options need to be presented.

The correct answer is (A) -4
(Source: Sample test item for Graduate Record Exam obtained from ets.org)

Here the test taker would be expected to click the NO button as a correct response. Depending on the test taker’s responses and the random selection of options, the correct option or other incorrect options may then be displayed one after another, each replacing the previous option on screen.
For the rest of the paper, it is important for the reader to discern the difference between the term “response” and the term “option.” A correct option is one that has been written or produced by the item developer and keyed as correct. Option -4 in the example above is a correct option. The other four options in the example are incorrect options. Using the DOMC approach, a correct “response” is given by the test taker when clicking on the YES or NO buttons, after an option is presented.

Answering YES when the option -4 is displayed is a correct response, but so is answering NO when option 12 is displayed. An incorrect response would be answering NO when option -4 is displayed or by answering YES when the incorrect option 12 is displayed. Providing one or more correct responses may be required to answer the question correctly. Providing even a single incorrect response, whether by clicking the YES or NO buttons, will cause the entire question to be scored as incorrect.
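These scoring rules can be made concrete with a short sketch. The following is a hypothetical illustration of the basic presentation-and-scoring loop, not an actual DOMC implementation; the option texts echo the GRE example above.

```python
import random

def run_domc(options, respond, rng=None):
    """Simulate a basic DOMC item.

    options: list of (option_text, is_correct) pairs, one of them correct.
    respond: function(option_text) -> True for YES, False for NO.
    Options are presented in random order. Any incorrect response fails
    the item immediately; a YES to the correct option scores it correct;
    a NO to an incorrect option simply continues the presentation.
    """
    rng = rng or random.Random()
    order = list(options)
    rng.shuffle(order)
    for text, is_correct in order:
        said_yes = respond(text)
        if said_yes != is_correct:
            return False  # a single incorrect response fails the whole item
        if is_correct:
            return True   # YES to the correct option: item answered correctly
    return False  # not reached when a correct option is in the pool

# A test taker who knows the answer is -4 scores the item correct
# regardless of the order in which options appear.
options = [("-4", True), ("12", False), ("7", False), ("-12", False), ("0", False)]
print(run_domc(options, lambda text: text == "-4"))  # True
```

Note that a test taker who answers NO to every option would eventually answer NO to the correct option, failing the item.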
The simplest version of the DOMC item shown above can be modified in many ways for psychometric or security purposes, just as the typical multiple choice item can be varied for similar reasons. Several significant variations are discussed below.
This variation allows the item developer to place a more stringent demand on the test taker: requiring two correct responses instead of one may make an item more difficult and fit a skill description better. Here is an example of the DOMC item content.
Those with the asterisk are correct options. Options are presented randomly. The item developer would set up the scoring in this way: When two correct options are eventually displayed and the test taker responds YES to both, the question would be scored as correct.
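As a sketch of how such scoring might be configured (the function and threshold here are illustrative assumptions, not the actual product logic):

```python
def score_two_correct(responses, required_yes=2):
    """Score a DOMC variant requiring YES to `required_yes` correct options.

    responses: (is_correct, said_yes) pairs in presentation order.
    Any single incorrect response fails the item; once enough correct
    YES responses accumulate, the item is scored correct.
    """
    yes_to_correct = 0
    for is_correct, said_yes in responses:
        if said_yes != is_correct:
            return False      # one wrong response fails the whole item
        if is_correct:
            yes_to_correct += 1
            if yes_to_correct == required_yes:
                return True   # second correct YES: item scored correct
    return False  # presentation ended before enough correct options appeared

# YES to two correct options, NO to an incorrect one in between -> correct
print(score_two_correct([(True, True), (False, False), (True, True)]))  # True
```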
A NO response to an incorrect option is a correct response, providing information about the test taker’s competence, but probably not as much as providing a YES response to a correct option. This is because the NO response would be an indirect correct response. Nevertheless, several of these indirect correct responses can be used to reasonably score the item. Consider the prime number item above. If the test taker saw the last 4 options (all non-prime numbers and incorrect) and correctly identified them as not being prime numbers, it may be reasonable psychometrically to score the question as correct even though no prime number was actually displayed. This feature of the DOMC item utilizes more information about the test taker than does the traditional multiple choice item where a test taker’s actual response to incorrect options is not captured or known.
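A hypothetical scoring rule along these lines might accept either a direct YES to a correct option or a set number of NO responses to incorrect options; the threshold of four mirrors the prime-number example:

```python
def score_with_indirect(responses, no_threshold=4):
    """Score a DOMC item that allows indirect correct responses.

    responses: (is_correct, said_yes) pairs in presentation order.
    A YES to a correct option scores the item correct outright; so do
    `no_threshold` NO responses to incorrect options, even if no correct
    option was ever displayed.
    """
    rejections = 0
    for is_correct, said_yes in responses:
        if said_yes != is_correct:
            return False          # any incorrect response fails the item
        if is_correct:
            return True           # direct correct response
        rejections += 1
        if rejections >= no_threshold:
            return True           # enough indirect correct responses
    return False

# Correctly rejecting four non-prime options scores the item correct,
# even though no prime number was displayed.
print(score_with_indirect([(False, False)] * 4))  # True
```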
For some test taking populations, perhaps small children or individuals with cognitive impairments or test anxiety, the simplest version of DOMC—asking for only a single response to a single option—may remove any negative effects of repetitively presenting a number of options. This variation may require the addition of more items to the exam. This use of a DOMC item also may work well for individuals who are English language learners (if the test is in English) or individuals who may be unfamiliar with how traditional multiple choice tests work.
SUPER-DOMC items are simply items that have many more correct and incorrect options than are usually created. As the number of options increases, the chance that test takers will see the same options decreases. Increasing the number of options improves the item’s security value and extends its long-term usefulness. For example, consider an item which requires the addition of two-digit numbers:
Skill: Add two-digit numbers.

It is a reasonable conclusion that such a question, particularly if all possible correct and incorrect options were part of the item, may be the only item needed to measure the skill, and the item can even be repeated to the same test taker during a test. It is likely that the SUPER-DOMC item cannot be functionally stolen or shared to help someone cheat. For these reasons, the item can be considered invulnerable and may never need to be replaced.
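The idea can be sketched with a hypothetical generator for the two-digit addition skill above; the distractor pool and sampling scheme are assumptions for illustration only.

```python
import random

def super_domc_options(a, b, n_incorrect=20, rng=None):
    """Generate options for a hypothetical SUPER-DOMC two-digit addition item.

    Returns the single correct sum and a large sample of plausible
    incorrect sums near the correct answer, so that different test
    takers rarely see the same set of options.
    """
    rng = rng or random.Random()
    correct = a + b
    # 30 candidate distractors within 15 of the correct answer
    distractor_pool = [correct + d for d in range(-15, 16) if d != 0]
    return correct, rng.sample(distractor_pool, n_incorrect)

correct, distractors = super_domc_options(47, 38)
print(correct)           # 85
print(len(distractors))  # 20
```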
This is a good practice for two important reasons: (1) immediately terminating the presentation of options when a correct YES response or an incorrect NO response is provided potentially exposes the item’s correct answer to the test taker, and (2) displaying additional non-scored options provides psychometric and measurement information concerning newly produced prospective options. As more experience is gained with the DOMC item, it is anticipated that other psychometric and security benefits may be derived from presenting additional non-scored options after the item has been answered and scored.
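A sketch of this practice (the trial-option count and function names are illustrative assumptions): after the scored portion of the item is decided, a few unscored trial options are still presented, so the test taker cannot tell which response ended the scored portion, and response data on the new options is collected.

```python
import random

def run_with_trial_options(scored_options, trial_options, respond, rng=None):
    """Decide a DOMC item, then present unscored trial options.

    scored_options: (text, is_correct) pairs, including a correct option.
    trial_options: texts of newly written options being pretested.
    respond: function(text) -> True for YES, False for NO.
    """
    rng = rng or random.Random()
    score = None
    for text, is_correct in scored_options:
        said_yes = respond(text)
        if said_yes != is_correct:
            score = False  # incorrect response: item scored incorrect
            break
        if is_correct:
            score = True   # correct YES: item scored correct
            break
    # Continue with a couple of unscored options; their responses are
    # recorded for pretesting but do not affect the score.
    shown = rng.sample(trial_options, k=min(2, len(trial_options)))
    trial_data = [(text, respond(text)) for text in shown]
    return score, trial_data
```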
The DOMC item with its variations is best understood by trying it out. The reader is invited to go to trydomc.com and select Sample Tests from the menu. Brief tests covering a wide range of topics have been created. A sample test should be taken multiple times in order to see how the items and the test appear differently each time, while scoring remains consistent. When each item is presented, if you are interested, there is an Item Inner Workings button, which presents details unique to the item. Clicking on the Item Inner Workings button will give you this information:
It is natural to wonder whether the DOMC format will perform as well in a test as the time-honored, century-old traditional format. There are several ways to look at this concern. The first is whether items in the DOMC format can combine to produce a reliable test score. Confirming the reliability of test scores is a strong professional standard for any exam, and score reliability is a strong feature of traditional multiple choice tests. An important study by Kingston, et al. (2012) using directly comparable conditions demonstrated that reliability was not diminished by use of DOMC and may even have been slightly enhanced. Furthermore, the researchers went on to show that using the DOMC item did not add complexity and was not measuring a newly introduced construct. Those results are very encouraging for using DOMC at the test level. But what is happening at the item level?
One consistent fact emerging from studies on the DOMC (Foster & Miller, 2009; Kingston, et al., 2012; Willing, 2014) is that the test becomes more difficult, reflecting the combined increase in difficulty of most of the items. The most likely explanation for this increased difficulty is that by removing the influence of test-taking skills, the items and the test demonstrate a heightened, but realistic, degree of difficulty.
A review of the item discrimination statistics, such as the point biserial correlation, provides a bit more of a mystery. Overall, the studies show that average item discrimination does not change much between DOMC and the traditional form; however, looking at individual items reveals idiosyncratic differences (Foster & Miller, 2009). That is, for a particular item, the discrimination value might be higher for the DOMC version but lower for the traditional version, or the reverse might be seen. This effect is grist for the research mill and may lead to new ways of viewing how we write items, both traditional multiple choice and DOMC.
One research result interesting to psychometricians is that tests with DOMC items take less time to complete. This time savings has ranged from 10% (Foster & Miller, 2009) to almost 40% (Willing, et al., 2014). While the reasons for this time savings are not yet well understood, the savings is intriguing, has obvious practical benefits, and may provide new insight into how test takers process test questions and provide responses.
Some psychometricians may be concerned that test takers do not have the same experience when the same item is presented in the DOMC format. This is true. The correct option may never be shown; options are presented randomly; some test takers may see a single option, others may see three or four. However, in every case, the experience is made more similar because the stem is always the same, and there is the same effort by the test taker to determine the answer.
Looking at this concern another way, it is also reasonable to assume that test takers do not experience the traditional multiple choice test item in a consistent way either. Despite how test developers write and present the items, it is a fact of cognitive science that test takers read and review what is presented to them differently from one another. For test questions, some may first review the options before reading the stem. Others may skim the stem first, then the options, then re-read the stem and options more carefully. Some may read the options from A to D, while others read them in the opposite direction.
My most common strategy in my test-taking days in school was to read the stem well, determine my best answer, and then run through the options quickly until I found the one that matched the answer I had produced. Once I did, I selected that answer and didn’t bother to read the others. What is clear from the research on how people read and answer multiple choice test items is that no standard way of responding is followed. Due to the inherent variability in responding to traditional multiple choice items, it is possible that the DOMC may actually provide a more standardized and controllable approach to presenting question content.
Other exciting projects in the testing field are dealing with similar issues and may shed some light on how the DOMC item should be evaluated. The concepts of Automatic Item Generation (AIG; Gierl & Haladyna, 2013) and Item Families (Geerlings, van der Linden & Glas, 2012) are based on models that use a small set of statistical properties to represent a large number of items that differ from each other in minor or major ways. Using the DOMC format can perhaps be viewed as implementing a version of AIG, creating items on-the-fly according to clear and consistent rules. Or the many ways a DOMC item appears to test takers may be considered part of an “item family.”
There is no arguing that the DOMC is a new and exciting format, and that it needs a growing base of users and research, particularly given that it is challenging a 100-year-old entrenched method. The several studies on DOMC conducted so far are encouraging. Kingston, et al. (2012), summarizing their research, concluded that, “Based on the results of this study, there appear to be no psychometric reasons for excluding DOMC from testing programs” (p. 15). As DOMC gains in use, academicians and practitioners will provide more research-based answers to help us better utilize this new approach.

The many advantages listed in the previous section are not realized without some effort, cost, and challenges to overcome. Test taker preparation is important, along with some training for item writers and changes to testing software systems. Here are a few challenges that can be expected, none of which is particularly unique to DOMC.
The DOMC is different enough from the usual format that people who have seen a large number of multiple choice test items need a bit of practice to gain familiarity with it. However, compared to other recently introduced item types, the format is intuitive and easily learned (Foster & Miller, 2009). Practice tests using the method are easy to produce and make available.
The DOMC is not appealing to most test takers, especially those who believe they have strong multiple choice test taking skills (Foster & Miller, 2009). Providing education on the benefits of the DOMC, most of which apply directly to the test taker, can help to overcome this objection.
The DOMC format may not work well for the true “choose the best option” item type. In this item type, all of the options are correct, but one is more correct than the others. While many traditional items ask for the “best” answer among the options, the options are actually written so that one is correct and the rest are incorrect. The true “choose the best option” question is very rare, fits few test objectives, and is very difficult to produce. A review by the author of the test preparation guides for the largest testing programs in the world revealed that less than 5% of items are of the “choose the best option” type, and most programs do not use it at all. If such items are in the item pool, they should continue to be presented in the traditional format.
The DOMC format places a greater demand on a test developer to create items with options that are clearly correct or clearly incorrect (which is, of course, a standard practice even for traditional multiple choice test items). Poor item writing is never acceptable.
This is generally the case with the introduction of any new item type.
DOMC items, like items delivered in adaptive tests and other test designs, cannot be reviewed again during an exam. There is one exception to this rule: if the test taker has not yet responded to the item, he or she can mark it for later review, then revisit and answer it later in the test.
This is due to the effort to create additional correct and incorrect options.
The return on investment (ROI) for any new innovation should be calculated or estimated using conservative assumptions and figures. Does it make sense to spend the time, effort, and money to implement DOMC into a testing program? Because the advantages are many, with some being hard to quantify, it is difficult to anticipate what value a DOMC conversion will have for any particular program. I recommend that the reader compare the costs and challenges against the expected solutions, benefits, or advantages.
The DOMC format is a relatively new format designed to address some of the problems with traditional multiple choice questions, including current and emerging threats to test security. Initial studies of the item type have shown that the DOMC is appropriate in many testing situations and can enhance test security while reducing long-term testing costs. Like any format, the DOMC has some limitations (e.g., it is not yet preferred by test takers); overall, however, it is a promising alternative to traditional multiple choice testing.