We projected the number of reviewers we’d need based on three reviewers per submission and no reviewer handling more than five submissions. Reviewers must re-opt in every year; since they have already registered with Ex Ordo, that should be simple using the Ex Ordo communications functions. If you don’t hear back from people you invite after emailing from Ex Ordo, try your work or conference email instead, because some Ex Ordo emails get diverted to junk mail folders. That also happened occasionally with our conference Gmail account, so you may want to set up an account through your school or something along those lines.
We’d be happy to provide you with a list of all the people we invited. You’ll probably need to add at least 100-200 names to what is already in Ex Ordo, depending on demand for ACR 2022 (which is hard to predict ahead of time anyway), because reviewers are much more likely to do a really good job if they are responsible for a small number of submissions.
We invited about 750 reviewers in early February (this does not include the 119 Program Committee members who were mainly responsible for reviewing Special Sessions, but both groups were invited at about the same time) and ended up with about 500 reviewers (plus the 119 PC members).
About a hundred reviewers declined our invitation and the others never responded to any of our invitations (whether sent through Ex Ordo or afterwards in other ways, usually through the conference or work emails). We later learned that some of the people who never replied had moved to new jobs and forgotten to update their email addresses with us, but most non-responses were apparently just people ignoring our invitations.
Once everyone has accepted, you have to provide Ex Ordo with the list of reviewers, PC members, AEs, and Chairs so they can label each person properly in the system (e.g. so PC members are not asked to review Working Papers, reviewers are not asked to handle Special Sessions, etc.).
So how do you find new reviewers? To get invited, people had to have been a previous ACR reviewer in good standing (which basically means the prior chairs reported that they did their work on time) OR they had to be recommended personally to us by a PC member, AE or Chair, whom we asked in January/February to send us names. We took those names and invited them through Ex Ordo. We tried to avoid using PhD students as reviewers but made a few exceptions (e.g. students graduating with their PhD in the summer before ACR, or students highly recommended by someone whose advice we trusted).
After inviting a new reviewer, we often received emails saying “I cannot enter my content and methods codes”. This was almost always due to different emails being used (i.e. the invitation going to one email address and the reviewer trying to register with a different one). At the end of the day, if the issue couldn’t get sorted after the reviewer contacted us, we just deleted them from the system and re-invited them with whatever email they told us they wanted to use. That worked 99% of the time. The only exception was a group of reviewers and authors with USC Marshall School of Business emails. We were never able to fix that issue (something wonky with the USC email system that we could not figure out), so we had to handle everything for these reviewers manually.
Here is a sample reviewer recruitment email (sent out through Ex Ordo):
Good morning, good afternoon or good night!
This is an invitation to review for the 2021 Association for Consumer Research annual conference being held virtually in Seattle (Oct. 28-31).
We would be thrilled for you to accept.
Here’s how: click on the “Yes I accept the invitation” link below. Then as soon as possible, log into the conference dashboard, update your details and nominate your topics of interest. Please nominate as many topics and methods as you feel comfortable handling. We will use this information to send you papers to review.
This is important: please examine your email address at the top of this message and register with that one. If you use a different email address to register than the one you are receiving this email at, you will not be able to successfully register. Once in the Ex Ordo system, you can change your email to any you prefer.
If we have made any errors (e.g. name spelling), please accept our apologies and make the correction after you log in.
No thanks, don't email me again
This is a later reminder:
About a week ago, we sent you an invitation to review for the Association for Consumer Research's 2021 conference. We would really value your time in helping us to shape a great conference.
Even if you have already accepted our invitation, we also need you to choose your areas of expertise in the system. You can do so by clicking on the link below:
Choose my tracks and topics of expertise
If you wish to respectfully decline this invitation, please click on the link below (and we won't bother you with any further emails).
I respectfully decline this invitation
Thank you and all our best!
And another reminder:
Dear [[Name]],
We would be grateful for you to accept our invitation to review for ACR 2021.
It's simple: Accept our invitation here (Choose my tracks and topics of expertise) and immediately proceed to registering. If we don’t have this information, we cannot assign you reviews.
THIS IS IMPORTANT: Some people report that after accepting our invitation, they are taken to the Ex Ordo website but are not presented the content/methods codes. This is because they are registering with a different email address than the one we are using to invite them.
So please examine your email address at the top of this message and register with that one. Once in the Ex Ordo system, you can change your email to any you prefer.
If you recall already accepting our invitation and are wondering why you are receiving this message, it’s because you are not registered. Please proceed to the Ex Ordo website and register using your email address listed at the top of this message.
If you are not interested in reviewing, please decline our invitation (I respectfully decline this invitation). We thank you for your consideration and will leave you alone!
Thank you and have a great day!
Tonya, Anat and Matt
ACR 2021 Co-Chairs
PS: If you would like to learn more on how to choose your topics you can follow our step-by step guide on the link below:
https://support.exordo.com/article/455-accepting-an-invite-as-a-reviewer-what-happens-next
Notes: Reviewers sometimes don’t know, remember or see what kind of review they are doing. This is most notable for Competitive Papers vs. Working Papers, since reviewers are often assigned both. It’s important that there be an obvious way to identify each type so reviewers are clear on the standards. Based on the comments sent back to authors, many reviewers were applying the (higher) standards of Competitive Papers to Working Papers (which represent less advanced work).
So how to assign reviews? Ex Ordo has an automated algorithm that will do it for you, but that only gets you part way. We double-checked all the assignments manually and found that there were 30-40 reviewers who had not been assigned any submissions to review, while other reviewers had been assigned 10+ submissions. So, after the initial allocation by the algorithm, we took a few hours and made sure no reviewer had more than 5 submissions and every reviewer had at least one submission to handle. Here’s how to do that:
Look at each submission: go to the Reviews tab and click on “list of submissions”. This provides a list of all the submissions. (If you only want to look at Special Sessions or Films or whatever, you can do that too.)
For each submission, there will be three names, each with a number and some stars. The number is how many reviews that person has been assigned; the stars indicate how much overlap there is between their content/methods codes and those of the submission. It’s something like: three stars is a 75%+ match, two stars is a 50%+ match, and one star is a 25%+ match (I probably have the percentages wrong, but more stars is better).
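Ex Ordo doesn’t document its matching formula to us, but conceptually it is a code-overlap calculation. This sketch shows one plausible version; the `match_stars` function, the overlap measure, and the 25/50/75% cutoffs are all assumptions based on the rough guesses above, not the platform’s actual algorithm.

```python
# Hypothetical sketch of a reviewer/submission match score and star rating.
# The overlap measure and the 25/50/75% thresholds are assumptions.

def match_stars(reviewer_codes, submission_codes):
    """Return (match_fraction, stars) for a reviewer against a submission."""
    if not submission_codes:
        return 0.0, 0
    # Fraction of the submission's content/methods codes the reviewer covers.
    overlap = len(set(reviewer_codes) & set(submission_codes))
    fraction = overlap / len(set(submission_codes))
    if fraction >= 0.75:
        stars = 3
    elif fraction >= 0.50:
        stars = 2
    elif fraction >= 0.25:
        stars = 1
    else:
        stars = 0
    return fraction, stars

# Example: reviewer covers 2 of a submission's 3 codes -> 2 stars
frac, stars = match_stars({"branding", "emotion", "surveys"},
                          {"branding", "emotion", "field-experiments"})
```

Whatever the exact formula, the key point for allocation is that stars summarize topical fit only; they say nothing about a reviewer’s current workload.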
So, suppose you see a reviewer name with too many submissions to review and you want to assign someone else. Click on the name you want to replace and you’ll see a box with “Recommended Reviewers”, “Search by Name or Email”, and “Invite a Reviewer”.
We found “Recommended Reviewers” not helpful, because it kept trying to assign people who already had too many submissions to handle. The algorithm apparently relies more on the stars than the number of submissions a reviewer is already handling.
We rarely used the “Invite a Reviewer” option, since it means you already know who you want to use (from memory?).
Best option: click on the “Search by Name or Email” and it will show you a list of all reviewers, sorted by stars but also showing how many reviews they already have. Go down the list and find reasonably well qualified people who don’t have too many submissions already. There’s a lot of balancing. Is it better to have 3 reviewers, with 1, 2 and 3 stars respectively? Or is it better to have 3 reviewers each with 2 stars? And so on. Those are decisions you have to make.
Here is the most important thing: one person should do all the allocations, or at the very least only one person at a time should be allocating. The entire allocation process is a moving jigsaw puzzle; if you make a change in one place, it has an impact on the system elsewhere. You could probably get away with one person allocating Special Sessions (because that is a separate review pool, involving only the PC members) and another person allocating all the Competitive Papers, Working Papers and Films (all of which draw from the same reviewer pool), but we largely had one person handle the allocation. It’s a big job, but if you tried it with several people, I suspect it would be a headache.
Before you send out the submissions to reviewers, go to the Conference Tab and download the Reviewers file in Excel. It will show you if any people who volunteered to review have not actually been assigned any submissions. You can use this to reallocate.
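The downloaded Reviewers file can be scanned for zero-assignment volunteers in a spreadsheet, or with a few lines of code once saved as CSV. This is just a sketch; the column names (`Name`, `Email`, `Assigned Reviews`) are assumptions about the export format, so adjust them to whatever the actual file contains.

```python
# Sketch: find volunteers with no assigned submissions in the exported
# Reviewers file (saved as CSV). Column names are assumed, not confirmed.
import csv

def unassigned_reviewers(path):
    """Return (name, email) pairs for reviewers with zero assignments."""
    with open(path, newline="", encoding="utf-8") as f:
        return [(row["Name"], row["Email"])
                for row in csv.DictReader(f)
                if int(row["Assigned Reviews"] or 0) == 0]
```

Anyone this turns up can then be worked into the allocation using the “Search by Name or Email” option described above.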
It’s fine to assign reviewers to submissions being handled by different Working Paper Chairs or AEs. The system can handle that.
So, a few other issues.
All submissions except the Special Sessions were double-blind. In the system, all of them were technically listed as double-blind, but because we allowed Special Session authors to identify themselves in the Word submission itself, those were not truly double-blind.
AEs have access to the names of reviewers and authors.
People who were invited to review could decline. This is fine early in the review process, when it’s easy to reassign. But if it happens a few days before the deadline, you’ll have to pull in favors: have AEs complete the reviews themselves, or beat the bushes for reviewers who can do it on short notice (e.g. PhD students you know, etc.).
It has become customary for Knowledge Forums not to be reviewed (beyond the KF Chairs) and for submitters to receive no feedback. Submitters put a lot of work in, and it’s frustrating to just be told “no, we’re rejecting your work”. It’s probably a good idea to have detailed feedback for all kinds of submissions, including KFs. Otherwise, get used to explaining to submitters why no feedback is provided, because they are going to email you irritated or worse.
After review assignments have gone out, you’ll need to keep track of which reviewers are behind, which are done, etc. There is some functionality for that in Ex Ordo, allowing you to use the software to remind reviewers who have not yet started, but there is a quirk: the system will identify any submission that has at least one reviewer who has not started, then email ALL the reviewers assigned to that submission (even if one or two of them have already started or completed their reviews). Receiving reminders to do work you’ve already done is mildly annoying.
Here is the workaround. In the CONFERENCE tab, you can export a spreadsheet that contains the status of all reviews. Use the REVIEW option (not the REVIEWERS option, which has the same problem outlined above). It generates the status of every review in detail. It is a large file, taking maybe 5 minutes to generate and download, but once you have it you can easily sort it and find the specific reviewers who have not started, along with their email addresses. Then, outside of the Ex Ordo system (using our work emails), we contacted these reviewers and reminded them to start.
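Sorting the REVIEW export by hand works fine, but the filtering step can also be scripted if you save the file as CSV. This is a sketch under assumed column names (`Reviewer Email`, `Review Status`, `Submission Title`); check the actual headers in your export before using it.

```python
# Sketch: from the REVIEW export (one row per review), collect reviewers who
# have not started, with the submissions they owe, for reminder emails sent
# outside Ex Ordo. Column names and the "Not Started" status label are
# assumptions about the export format.
import csv
from collections import defaultdict

def reviewers_not_started(path):
    """Map each not-started reviewer's email to the submissions they owe."""
    pending = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["Review Status"].strip().lower() == "not started":
                pending[row["Reviewer Email"]].append(row["Submission Title"])
    return dict(pending)
```

Because this works per review rather than per submission, it avoids the quirk above: only genuinely delinquent reviewers get reminded.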
Overall, we provided reminders to reviewers a few ways and we developed a schedule of when reminders should go out and from whom:
One and two weeks into reviewing and three days before the deadline, we sent email reminders from Ex Ordo to all reviewers/PC members.
We encouraged AEs/Films Chairs/Working Paper chairs to email their reviewers as well. To facilitate that, we provided each AE/Chair an excel spreadsheet, based on the REVIEW option described above, containing a list of the papers they were responsible for along with review status for each. Using that, they sent out personalized emails to each reviewer. Because reviewers/PC members were assigned to submissions that were being handled by different AEs/Chairs, this meant that sometimes a reviewer would get two or more reminders.
Here and there, we the conference co-chairs also sent out emails to reviewers just to poke them. This was not that common though.
We gave reviewers about 3 weeks to complete their task. In the end, there were about 5-10 reviewers and 3-4 PC members who disappeared. Also, about 80% of the reviews were submitted in the last 48 hours before the deadline. We told the reviewers their deadline was May 31, but we actually kept it open until June 2 so we could accommodate the late arrivals. Also, because the Ex Ordo server is in Ireland and our reviewers were all around the world, it made sense to keep it open just a day or two longer to accommodate time zones etc…
Issues that came up a lot:
Cannot access reviews: Sometimes reviewers (and AEs) could not access their reviews/submissions. This was almost always because they were logging in with the wrong email address (as explained above). If they registered as an author with one email address and as an AE/reviewer/chair with another, Ex Ordo thinks they are different people. In one case, this resulted in a reviewer being assigned their own paper to review (so yes, it’s a problem). You cannot consolidate these accounts yourself - you have to email Ex Ordo to do that - but you can get the person their assigned submissions in the meantime. Simply find the reviewer in the system (List of Reviewers), go to the bottom of the page and click on “Download Pack”. This creates a zip file of the submissions they are supposed to review, along with the instructions to reviewers. Send it to the reviewer. They can do their reviews offline and either (a) upload them once the access issue is solved or (b) email you the review comments/scores so you can enter them manually.
Prematurely closed reviews: When a reviewer is done with a review (i.e. has provided scores and comments), they are supposed to press the button at the bottom that says “Complete Review”. Sometimes they pressed this button before they were actually done, which is a problem because at that point they are locked out of the review. They’ll email you about the problem, and you just go back in and manually re-open the review.
Mixed up comments: We required reviewers to provide written comments to the authors. Sometimes, because a reviewer had multiple submissions to handle, they inadvertently entered the wrong comments. There are two solutions. First (recommended), you can open up the review (you’ll see a “Re-Open Review” button), which gives the reviewer access again and they can make edits. You just have to email the reviewer after you’ve reopened the review so they know it’s possible to make edits. Second, you can have the reviewer send you the correct comments and you can manually enter them (just click on the review and you can paste comments in). In a few cases where the review was late or the reviewer was technologically challenged, we did this but it’s probably better to have reviewers do it themselves, both to reduce your workload and to ensure the reviewer (not you) is responsible for any possible mistakes.
Failed to close review: Once a reviewer is done, they are supposed to click on “Complete Review”. Many forgot, and either just closed the page or pressed “Save and Close” instead. So, in the instructions to reviewers, you should include language reminding them to use the “Complete Review” button (we forgot to do this in our instructions and it became a bit of a hassle). Once the review deadline is past, you can identify the reviews where this is a problem, because they all report the same “percentage complete” rate. For us, it was exactly 89% - that is, any submission where the review was listed as 89% complete was one where the reviewer forgot to press the button. You can just go in and complete the review for them manually. Best not to do this before the deadline passes, because sometimes reviewers were legitimately not done with the review, and I made the mistake of prematurely closing out their reviews.
No written comments: We also found other reviews where the “percentage complete” was a smaller number (I think it was exactly 79%). This corresponded to reviewers completing the rating scores for a submission but not providing written comments. We required reviewers to provide constructive feedback to authors, so when it was apparent that no comments had been provided, you have a decision to make: (a) go back to the reviewer and ask for comments, (b) set aside the review, or (c) include it by manually closing out the review, keeping in mind that this means the authors get no feedback at all (irritating for authors, since we also did not provide them the reviewer scores associated with their submission). Again, it’s probably best to wait until the review deadline has passed before doing anything based on this number.
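Triaging these cases from the export boils down to bucketing reviews by their “percentage complete” value. A minimal sketch: note that the 89% and 79% values matched our particular configuration (seven scale questions plus comments); with different review criteria the exact percentages will differ, so verify them against your own export after the deadline before relying on them.

```python
# Sketch: bucket reviews by the "percentage complete" value in the export.
# The 89% and 79% thresholds are specific to our configuration; treat them
# as placeholders to be verified against your own export.

def classify_review(pct_complete):
    """Map a review's percentage-complete value to its likely state."""
    if pct_complete == 100:
        return "complete"
    if pct_complete == 89:
        return "scores and comments done, forgot to press Complete Review"
    if pct_complete == 79:
        return "scores only, no written comments"
    return "in progress or not started"
```

Running this over the export's percentage column gives you three worklists: reviews to close manually, reviewers to chase for comments, and reviews still underway.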
Missing and late reviews: As organizers, this is frustrating. People volunteer to be reviewers and then they disappear, are late, or do shoddy work. However, we often learned that reviewers were going through horrific things in their personal lives and were doing the best they could; our interactions with them called for humanity, not stern or angry emails about missing reviews. Sometimes reviewers asked to be removed, and we were gracious and just reassigned their work. Other times (close to or after the review deadline), AEs or co-chairs stepped up and completed replacement reviews. We also sometimes just accepted that some small percentage of submissions would have only 2 reviews instead of 3. In the end, using combinations of reminders, friendly emails, technical assistance and offers to manually enter reviews, we got 98% of reviews submitted, but that took a lot of effort. Just assume that deadline day +/- 3 days is going to be nutty for you.
Special Sessions: We acted as AE for the special sessions. As a result, these are the first types of submissions you can make decisions on. You don’t have to wait for AEs or Chairs.
Here is the actual email we sent to reviewers (through Ex Ordo):
Thank you for agreeing to review submissions for this year’s Association for Consumer Research conference. The submissions you have been assigned are ready for review.
For each submission, we are asking you for two kinds of feedback. Both are important.
First, when you are ready to upload your review, the conference software will prompt you to answer seven simple questions (e.g. contribution, execution etc...) using 5-pt scales.
Second, you will be required to provide authors with written feedback. Please address the authors in a concrete, constructive and generous manner. Be kind. Comment on the positive aspects of their submission and provide detailed, specific suggestions that will help them improve their submission. Please write the type of review that you would be proud to sign your name to.
You will also have the option of providing the editorial team any additional feedback you would like.
You can start your reviews by logging into acr2021.exordo.com and clicking on the "Start Reviewing" button to see the submissions that have been allocated to you.
When finished each review, please mark each submission as Complete so we can move forward with it. At this point, you will not be able to make any further edits.
You can find a detailed guide on this process following the link below:
https://support.exordo.com/article/450-reviewing-my-assigned-submissions
The deadline to complete your reviews is May 28, 2021. We will be grateful if you would complete your reviews before this deadline so we can issue our notifications to authors without delay.
If you have questions, please reply to this email.
Notes:
You need to invite Knowledge Forum Chairs to register in Ex Ordo so they can access the Knowledge Forums. It’s nearly the same process as for AEs: invite them as reviewers, have them enter at least one content code (it doesn’t matter what they enter; the three Knowledge Forum chairs decide how to split up their workload rather than matching on content/methods codes) and register, then provide those names to Ex Ordo. They’ll make sure these chairs are assigned the Knowledge Forum submissions. You don’t need to invite Doctoral Symposium Chairs or Mid-Career Mentorship Chairs; they don’t need access to the system.
In Ex Ordo, you create review criteria. These cannot vary by type of submission so you cannot, for example, create different criteria to judge Special Sessions vs. Working papers. Our solution was to come up with general criteria:
TOPIC: Does this submission tackle an interesting and important topic? (1 = not interesting & important; 5 = exceptionally interesting and important)
EXECUTION: How methodologically and/or theoretically rigorous is this submission? (1 = not rigorous, 5 = exceptionally rigorous)
CONTRIBUTION: If the authors make revisions according to the feedback you provide them (below), what is the submission’s potential to contribute to consumer research? (1 = no contribution, 5 = profound contribution)
NOVELTY: We want to identify submissions that deal with a novel marketplace phenomenon that is emergent and poised to motivate future research. Does this submission meet this criterion? (1 = definitely not novel, 5 = exceptionally novel) This is the question intended to help screen for Marketing Letters.
ACCEPTANCE: Should this submission be accepted to this conference? (1 = definitely no, 5 = definitely yes)
AWARDS: Different types of submissions are eligible to be recognized (e.g. Best Competitive Paper). Is this submission outstanding enough to be considered for an award? (1 = definitely no, 5 = definitely yes)
CONFIDENCE: As a reviewer, how confident are you within the knowledge area discussed in this submission? (1 = not at all confident, 5 = exceptionally confident)
Each reviewer was asked to answer these questions for each submission, as well as to provide the authors written feedback (they also had the option of providing AEs with written feedback and many did).
We had not intended to use all these categories as criteria to accept or reject submissions. For example, “Novelty” was added due to an initiative between ACR and Marketing Letters to help the journal’s editors identify possible submissions for their journal and “Awards” was intended to help us identify truly outstanding submissions as award candidates.
The way the system is set up now, it treats these seven criteria as equal and uses all of them to create an overall score for each submission. Our goal was to have AEs/Chairs focus on the Topic, Execution, Contribution, Confidence and Acceptance criteria (plus written comments). We did not want the Awards and Novelty criteria included (both drastically reduce the overall score, which made us worry about false negatives: rejecting a paper that is actually decent), but we couldn’t figure out how to configure the system to use only a subset of the criteria when calculating the overall score. Hopefully you can fix this.
Our work-around was to go into “Excel Exports” and download the “Review” file. Then, in Excel, you can rank and sort the evaluations and play around with weighting (e.g. average of Topic, Contribution, Execution and Acceptance weighted by Confidence etc…). Once reviews were all in, we provided each AE and Working Paper chair an excel file with all their submissions plus the criteria scores for each so they could use them to help make their decisions.
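The "play around with weighting" step can be done with spreadsheet formulas, but here is a small sketch of one such scheme in code: averaging Topic, Execution, Contribution and Acceptance, with each review weighted by the reviewer's Confidence. The `overall_score` function and this particular weighting are illustrations, not the scheme we settled on; choose whatever combination your AEs agree makes sense.

```python
# Sketch: recompute an overall score from the exported criteria using only a
# subset of them, with each review weighted by reviewer Confidence. This
# particular weighting scheme is one illustrative choice among many.

CRITERIA = ("Topic", "Execution", "Contribution", "Acceptance")

def overall_score(reviews):
    """Confidence-weighted mean of the chosen criteria across all reviews.

    reviews: list of dicts with the criteria scores plus "Confidence" (1-5).
    Returns None if there are no reviews (or all have zero confidence).
    """
    num = den = 0.0
    for r in reviews:
        mean = sum(r[c] for c in CRITERIA) / len(CRITERIA)
        num += mean * r["Confidence"]
        den += r["Confidence"]
    return num / den if den else None

# Example: two reviews of one submission; the confident positive review
# dominates the less confident negative one.
reviews = [
    {"Topic": 4, "Execution": 3, "Contribution": 4, "Acceptance": 4, "Confidence": 5},
    {"Topic": 2, "Execution": 2, "Contribution": 3, "Acceptance": 2, "Confidence": 2},
]
score = overall_score(reviews)  # about 3.32
```

Whatever scheme you pick, keeping the recomputation outside Ex Ordo means Awards and Novelty can still be collected without dragging down the score AEs use for decisions.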
Finally, we opted to not provide criteria scores to authors. We worried about getting into discussions about averages and distributions etc…