# The ACR Match Game-1992

To cite:

J. Jeffrey Inman (1993), "The ACR Match Game-1992," in NA - Advances in Consumer Research Volume 20, eds. Leigh McAlister and Michael L. Rothschild, Provo, UT: Association for Consumer Research, Page 12.

Direct URL:

http://acrwebsite.org/volumes/7409/volumes/v20/NA-20

The Problem: Assign each of more than 150 papers to three of more than 200 reviewers such that each reviewer receives no more than three papers and the paper/reviewer "fit" is maximized across 92 content areas and 46 method areas.

Such was the task for this year's ACR conference (or, for that matter, next year's conference and most other academic conferences). In the past, the solution would consist of laying out the papers and reviewers on a large table and going through the assignment process manually. This year's ACR conference organizers asked me to attempt to automate the process somewhat. The results are described in the following paragraphs in the hope that interested readers in general and prospective authors and reviewers in particular will contribute their ideas for improvements that can be designed into the 1993 ACR. Please address suggestions to me at MC-1421, Dept of Marketing, School of Business Administration, University of Southern California, Los Angeles, CA 90089-1421. Alternatively, my telephone and fax numbers are, respectively, (213) 740-5052 and 740-7828 and my BITNET is INMAN@USCVM.

So far, only the initial stage of the process is automated (i.e., calculating the numbers of matches between reviewers and papers). The actual assignment of particular papers to particular reviewers remains a manual process. Below are key aspects of the process:

Input: In the call for papers in the December 1991 ACR newsletter, each author was asked to indicate all relevant content areas from a list of 92 content areas and 46 method areas. Prospective reviewers performed a similar task regarding their own expertise in the content and method areas.

Computer Package: Lotus 1-2-3.

Entry Format: Four 0/1 matrices were constructed: p x 92 and r x 92 content matrices, and p x 46 and r x 46 method matrices, where p was the number of papers (152 this year) and r was the number of prospective reviewers (over 200). Each cell indicated whether the paper's authors or the prospective reviewer had checked that content or method area. For example, a paper/content matrix cell was coded 1 if the paper's author checked that particular content area and 0 otherwise.

Matrix Manipulation: The reviewer content matrix was multiplied by the transpose of the paper content matrix, yielding an r x p matrix. Each cell of this matrix gave the number of content matches between a reviewer and a paper. The same was done for the method matrices.
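The match computation can be sketched as follows. This is a toy illustration with hypothetical data (the real matrices were r x 92 and p x 92, built in Lotus 1-2-3 rather than in code):

```python
# Toy illustration of the match computation; all data here is hypothetical.
# Rows are reviewers or papers, columns are content areas, 1 = area checked.
reviewers = [[1, 0, 1, 0],
             [0, 1, 1, 1]]
papers = [[1, 1, 0, 0],
          [1, 0, 1, 0],
          [0, 1, 1, 1]]

# Multiplying the reviewer matrix by the transpose of the paper matrix
# yields an r x p matrix: cell (r, p) counts the areas checked by both
# reviewer r and the authors of paper p.
matches = [[sum(a * b for a, b in zip(rev, pap)) for pap in papers]
           for rev in reviewers]
# matches == [[1, 2, 1], [1, 1, 3]]
```

Because both matrices are 0/1, each dot product simply counts the areas checked by both parties.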

Assigning Papers to Reviewers: The two matrices were printed, and it was manually ensured that (1) each paper was assigned to three reviewers, (2) the number of matches between each reviewer and each assigned paper was as high as possible, and (3) no reviewer was assigned more than three papers. This basically amounted to looking down a paper's column and picking off the reviewers with the most matches on that paper, with constraint (3) occasionally coming into play. Content area matches were weighted more heavily than method matches because most authors and reviewers indicated very few (in many cases, no) method areas. In addition, matches were constrained such that no paper was judged by a reviewer from the same school and no more than one novice reviewer was assigned to a paper. The resulting matches were reported back to the conference chairs, who verified them and mailed the papers out to the reviewers.
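A simplified sketch of this manual procedure, assuming the match matrix has already been computed (the same-school and novice-reviewer constraints are omitted for brevity, and the function name is hypothetical):

```python
def assign_papers(matches, papers_per_reviewer=3, reviewers_per_paper=3):
    """Greedy sketch of the manual process: for each paper, take the
    reviewers with the most matches, skipping any reviewer already at
    capacity. matches[r][p] = match count for reviewer r and paper p."""
    n_reviewers = len(matches)
    n_papers = len(matches[0])
    load = [0] * n_reviewers                       # papers held by each reviewer
    assignment = {p: [] for p in range(n_papers)}
    for p in range(n_papers):
        # Reviewers ranked by match count for this paper, best first.
        ranked = sorted(range(n_reviewers), key=lambda r: -matches[r][p])
        for r in ranked:
            if load[r] < papers_per_reviewer:
                assignment[p].append(r)
                load[r] += 1
            if len(assignment[p]) == reviewers_per_paper:
                break
    return assignment

# Hypothetical toy instance: 3 reviewers, 2 papers, capacity 2 each.
# assign_papers([[3, 1], [2, 2], [0, 3]],
#               papers_per_reviewer=2, reviewers_per_paper=2)
# == {0: [0, 1], 1: [2, 1]}
```

Note that this greedy pass is order-dependent: a reviewer filled up by early papers is unavailable to later ones, which is exactly where the manual tradeoffs described above come in.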

FUTURE PLANS

Ideally, one would like the computer to handle all of the mundane work, leaving only the portion requiring intricate tradeoffs in human hands. Below are some of the changes I would make if asked to perform this task for next year's conference:

Input: The content and method checklists should be redesigned and shortened. Many categories were scarcely used, needlessly complicating both the matrix multiplication and the matching process. In particular, the list of method areas should be drastically reduced.

Matrix Manipulation: The content and method match matrices could be summed together into a single matrix with the added condition that there must be at least one reviewer/paper match on both fronts, otherwise the cell value is zero.
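The proposed combination rule could look like the following sketch (function name hypothetical): sum the two match matrices, but zero any cell lacking at least one match on each front.

```python
def combine(content, method):
    """Sum the content and method match matrices cell by cell, but set a
    cell to zero unless the reviewer/paper pair has at least one content
    match AND at least one method match."""
    rows, cols = len(content), len(content[0])
    return [[(content[r][p] + method[r][p])
             if content[r][p] > 0 and method[r][p] > 0 else 0
             for p in range(cols)]
            for r in range(rows)]

# Hypothetical toy matrices (2 reviewers x 2 papers):
# combine([[2, 0], [1, 3]], [[1, 1], [0, 2]]) == [[3, 0], [0, 5]]
```

Collapsing the two matrices to one would leave a single printout (or a single objective) to work from, at the cost of the explicit content-versus-method weighting used this year.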

Assignment: The process could be transferred to Excel to leverage its integer programming capabilities and automate the "easy" initial stage of the actual reviewer/paper assignment process. The later stages, where the constraints become active (e.g., no more than 3 papers per reviewer), will probably be best done manually.
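The underlying optimization is a small integer program: choose x[r][p] in {0, 1} to maximize the total match count, with three reviewers per paper and at most three papers per reviewer. As a hedged illustration of the objective and constraints (not the proposed Excel implementation), a toy exhaustive search over a hypothetical instance:

```python
from itertools import combinations, product

def best_assignment(matches, reviewers_per_paper=2, papers_per_reviewer=2):
    """Exhaustive search standing in for an integer-programming solver:
    try every choice of reviewer set per paper, keep the feasible choice
    with the highest total match count. Feasible only for tiny instances;
    a real solver scales far better."""
    n_reviewers, n_papers = len(matches), len(matches[0])
    per_paper = list(combinations(range(n_reviewers), reviewers_per_paper))
    best_score, best = -1, None
    for choice in product(per_paper, repeat=n_papers):
        # Check the per-reviewer capacity constraint.
        load = [0] * n_reviewers
        for revs in choice:
            for r in revs:
                load[r] += 1
        if max(load) > papers_per_reviewer:
            continue
        # Objective: total matches across all assignments.
        score = sum(matches[r][p]
                    for p, revs in enumerate(choice) for r in revs)
        if score > best_score:
            best_score, best = score, choice
    return best_score, best

# Hypothetical toy instance (3 reviewers, 2 papers):
# best_assignment([[3, 1], [2, 2], [0, 3]]) == (10, ((0, 1), (1, 2)))
```

The same-school and novice constraints can be folded in as further restrictions on which reviewer sets are feasible for a given paper.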

SUMMARY

By far the most important component of the process is the quality of the input provided by authors and reviewers. With over 150 papers and over 200 reviewers, the assignment process can improve only if the content and method areas are carefully and completely indicated by both authors and reviewers.
