IMPORTANT: Whether you have extensive experience as a science fair judge or are a "rookie," please review the rubric and/or the 18-minute judging overview video below, which covers the judging guidelines and criteria in detail and offers helpful tips for judging effectively in our "virtual" fair!
Because this is a "virtual" fair, you do not have the benefit of interacting with the student(s) directly to ask questions and clarify your understanding of their exhibit. Virtual science fair judging requires a shift from interactive dialogue to forensic evaluation of documentation, video evidence, and ethical compliance.
The videos can be accessed online from the links sent to you in your evaluation packet email. They may take the form of a narrated PowerPoint or slide presentation and may include video of the presenters. Some exhibits may have more than one contributor and possibly more than one video link. Regardless of the number of videos, the total duration of all videos for a High School exhibit should be no longer than 15 minutes.
In addition, you will be asked to evaluate documentation from each of your selected exhibits. The documentation for each exhibit is provided in your evaluation packet email. The documentation will include a summary of the project, poster board, AI Use Logbook, and a set of responses to a standard questionnaire designed to address some of the Judging Criteria explained below.
In addition to your evaluation packet email, the video(s) and documentation for each exhibit you evaluate will also be embedded within the Exhibit Evaluation Form. This gives you the opportunity to re-examine the content while you are entering scores for each exhibit.
Therefore, you should allocate enough time to review the materials submitted for each of your selected exhibits. Set aside at least 30–45 minutes per exhibit: 15 minutes for the video and 15+ minutes for the documentation, questionnaire responses, and note-taking. This prevents rushed decisions and ensures adequate review of all evidence before scoring. Take detailed notes to capture your observations, remain objective and fair throughout, and reflect on the insights students provide.
Your goal is to feel confident that you have evaluated each project adequately, objectively, and fairly.
We recommend that you visit the Exhibit Evaluation Form before you start judging to print a blank form, which will list your specific exhibits and allow you to record your notes and interim scores during your evaluations. Please note that printing a blank ballot with the information from your assigned exhibits may result in upwards of 60 pages!
When you have completed your evaluations, you can return to the Exhibit Evaluation Form to transcribe your scores online from your written notes.
You also have the option to save your entries online along the way. You will receive a link by email to allow you to continue editing your entries after your first save. Please use this link to periodically revisit the form to make further changes or additions to your entries.
Because you will have only one opportunity to submit your scores in the online form, do NOT click on the "Submit" button unless you have completed your evaluations and are prepared to submit your final scores.
Each exhibit you judge will be scored on five criteria: Creative Ability, Scientific Thought, Thoroughness, Skill, and Clarity.
Is this an original idea or an original approach to a new idea? Both are good.
Does the student show ingenuity in the materials, apparatus, and techniques, or was the exhibit built from a purchased kit?
Does the student demonstrate the ability to improvise and adapt?
Is the project merely a collection, or is it a purposeful one?
Does the exhibit show: organized procedures, accurate measurements and/or observations, controlled experiments, cause & effect reasoning, theories, analysis, and synthesis?
Weight should be given to the likely amount of real study and effort represented in the exhibit. The project cannot be just a demonstration or an attractive display.
How completely has the student explored or studied the problem? Was evidence gathered as data in notebooks, journals, or logbooks?
Are there bibliographies, charts, tables, and graphs?
Does the exhibit identify experimental organisms and/or apparatus?
Is the workmanship good?
Does the student show evidence of mastery of techniques?
Did the student construct their own apparatus?
Did the student analyze their data effectively?
Did they use statistical methods to interpret their findings?
Were the conclusions based on sound judgment, and were any unexpected findings explained?
Are links to online video presentations and documentation working?
Are the sound and image in the video presentations clear and easy to understand?
Does the display clearly explain what was done?
Does the exhibit provide a neatly written, well-organized backboard that is easy to follow?
Elements that ensure clarity include labels, guide marks, well-written descriptions, emphasis on important items, labeled graphs and tables, and legends beneath graphs and tables. Does the exhibit provide these?
Students must complete a questionnaire for their exhibit as part of the fair requirements. Students answer 17 mandatory questions and select 3 out of 6 additional questions for judges to assess, covering topics such as:
Research Motivation & Personal Growth
Experimental Design & Methodological Choices
Data Quality, Accuracy & Reliability
Data Analysis & Interpretation
Critical Thinking & Problem-Solving
Replicability & Scientific Rigor
Literature Review & Research Ethics
Impact & Real-World Applications
Innovation & Unique Contributions
Communication & Mentorship
Technical Skills & Growth
Reflection & Scientific Curiosity
Abstract (500 words or less)
What question should a judge ask about your results that is not covered by the remaining questions, and how would you answer it?
Why did you choose to investigate this research topic? How did you develop your idea, and what inspired you to do this research? How did you grow from this research experience?
What was the role of your mentor/supervisor in your research project? What was the best part of working with that person, and how did it change your insight into your research?
How did you decide how much data to collect and which variables to measure? How did you ensure the data were accurate and reliable, and what relationships did you find?
Which variable mattered most in your research, and how did you measure it? Which variables did you control, and how did you deal with any outliers, bias, or errors?
How did you analyze and interpret your data? What mathematical and/or statistical test(s) did you use to analyze your data and why? If you did NOT perform statistical analysis on your data, please describe the mathematical reasons why you didn't.
Was there an alternate methodology that you considered? If so, why did you choose this methodology over the alternatives?
What were the biggest challenges you faced in your research, how did you stay resilient, and what did you do to overcome them?
If you repeated the research, what would you change? How would you expand it with more time or resources?
If someone else repeated your research with the same materials and scientific method, would they get the same results? Why or why not?
How did your research follow all appropriate and relevant safety and research protocols? What steps did you take to ensure your research adhered to ethical standards?
Why did you choose your sources/references, and how did your review of scientific literature help you most with your research project? How did you make sure your research was thorough and plagiarism-free?
What did you learn from your research, and what follow-up research would you suggest?
What are the importance, impact, and benefits of your findings? What, if any, are the societal, ethical, or interdisciplinary implications that could inform policy, industry practices, or ethical debates?
What is the most significant accomplishment and innovation from your research?
How can your research findings be applied to the real world in the future?
Additional Questions (Select only three to answer)
How would you explain your research in simple terms to a middle school student who wants to base their science project on it? If you were mentoring them, what guidance would you give and why?
What new skills and/or techniques did you learn and use? Explain the new skills and/or techniques, how you used them, and why they were important.
Explain how your research approach was unique or novel and how it led to better results. What makes your research different from others on this topic?
What past research supports or challenges your conclusions, and how did it shape your approach?
What are the main limitations of your results, and how do they affect your conclusions? What critique of your work would be hardest to address, and how would you respond?
What surprised you during your research, or what would have surprised you if it had happened? Why do you think that?
Some exhibits may have more than one student contributing. Do not weigh exhibits with more contributors any more or less than those with fewer contributors.
At the same time, when evaluating an exhibit with multiple contributors, you must judge the overall exhibit as a whole. Each contributor should be participating equally, albeit using different skill sets and addressing different aspects of the project. Don't evaluate the exhibit solely based on the best or worst of the contributors in that exhibit.
Again, because you do not have the benefit of interacting directly with the student(s), you will need to base your evaluation on the quality and content of the submitted video(s) and documentation. Your judgment should focus on those aspects of their project that relate to the specific judging criteria of Creative Ability, Scientific Thought, Thoroughness, Skill, and Clarity. However, you may also want to consider how you might answer other general questions based on the information provided. Below are examples to help you get started.
Did the student(s) explain why they selected this project?
Did the student(s) describe the most interesting part of doing their project?
Did the student(s) reflect on one thing that was done in this project that made them proud?
Did the student(s) uncover anything surprising?
Did the student(s) reveal the most important thing learned from doing this project?
How did the student(s) come up with the hypothesis for the project?
How did the student(s) decide on the approach used to test their hypothesis?
Did the student(s) describe any previous work or ideas that led them to think that this project would be useful and worth pursuing?
Did the student(s) discuss what sparked their interest in this general area? A teacher? A book? A TV show?
Do you have an understanding of why they chose this particular project? Did they show a level of excitement and passion for their work?
Did the student(s) present an overview of what relationship they hoped to establish between variables?
Did the student(s) highlight a variable in the experiment and how they measured it? Did they indicate how accurate their measurements were?
How did they decide which variables to control?
Did they indicate the biases or sources of errors in their measurements or procedures?
Did they present any lessons learned from their findings?
Which sources were used to research the topic and experiment?
How did they decide how much data to collect for this project?
Did the student(s) indicate the need to change direction mid-way through the project, and if so, why? Was the reason sound? How did they change procedures and methodology to cope?
Were their measurements sufficiently precise to convince you of their results?
Did the project follow all appropriate and relevant safety and research protocols?
Did the student(s) indicate how long it took to complete this project?
Did they describe the most time-consuming task?
How much was their mentor involved?
Based on the materials that you reviewed for the project, can you determine the skills that were used to complete this project?
Did the video and documentation showcase good communication skills relevant to the age of the student(s)?
Were the video and associated documentation clear and easy to understand?
Did you understand the purpose of their project?
Did you understand the key question or hypothesis addressed in this project?
Did the student(s) highlight the most important finding of their project?
Did the student(s) describe why their findings were important and how they apply to the real world?
Of course, don’t feel that it is necessary to stick with the above questions. Be creative!
Use your technical expertise and background, and consider using Socratic questioning to extract information from the materials the student(s) provided.
Also consider whether the student followed standard protocols for their project. Our High School exhibits must follow proper protocol and meet WESEF standards!
MOST IMPORTANT: all submitted projects that involve any living creature or human MUST be carried out under an IACUC or IRB protocol, respectively. This requirement is mandated by all peer-reviewed journals and all funding sources before a project is considered for publication or accepted for review. This requirement MUST be signed off by a teacher before the exhibit is allowed to participate in the fair.
If the project involves living creatures or humans, and it does not demonstrate or indicate compliance with this requirement of an IACUC or IRB protocol, respectively, then as a judge, you should assign the lowest score (0) for Thoroughness to the exhibit, thereby removing it from any award consideration.
Science Fair AI Policy and Documentation Requirement
Artificial intelligence (AI) tools—such as large language models, code assistants, and image generators—are valuable resources for research but raise important questions about originality, transparency, and scientific integrity. The Tri-County Science & Technology Fair permits the responsible use of AI as a research aid, provided students fully disclose how AI was used, verify AI outputs, and retain clear ownership of their work.
A formal policy and a simple AI Use Logbook are necessary to:
Ensure fairness by making all uses of AI visible to judges and reviewers.
Preserve academic integrity by distinguishing student-created ideas, analysis, and data from AI-generated content.
Support reproducibility and verification by documenting prompts, responses, edits, and validation steps.
Promote ethical awareness by requiring students to consider bias, limitations, and privacy when using AI.
Help mentors and reviewers evaluate the student’s understanding and contributions to the project.
Students must follow the accompanying guidelines and complete the AI Use Logbook whenever AI tools materially contribute to their project (e.g., literature searches, coding, analysis, figure generation, or writing assistance).
Failure to disclose or document AI use may affect project eligibility or judging. If, as a judge, you discover misuse of artificial intelligence, you should notify the Judges Committee immediately and assign the lowest score (0) for Skill to the exhibit, thereby removing it from any award consideration.
Transparency & citation requirement
Students must disclose any AI tools used (LLMs, image generators, code assistants) and cite them in the abstract, logbook, and display. This disclosure must be in writing, using the attached template, and a declaration of such use must be included in any video presentation.
AI as a resource, not a substitute
AI may be used for literature searches, brainstorming, editing for grammar/clarity, or generating figures/code prototypes — but not to produce the core intellectual work, primary data, analysis, conclusions, or final report that the student claims as their own.
Student ownership and original work
The project must primarily reflect the original ideas, analysis, and interpretation of the student(s). AI-generated text or data cannot be presented as the original conclusions.
Verification and accountability
Students are responsible for verifying facts, analyses, and outputs from AI tools with reliable sources and for documenting validation steps.
Documentation of AI interactions
Maintain a logbook/scientific journal (shown below) recording prompts, AI responses used, how outputs were edited, and how AI informed decisions. Include this documentation in project paperwork.
Prohibited uses
Explicit prohibitions include: using AI to fabricate or manipulate data, to state conclusions, to write the project or abstract without attribution, or to impersonate human subjects.
Ethics / bias awareness
Students should acknowledge potential AI biases, discuss limitations of AI outputs in their project write-up, and take steps to mitigate bias where relevant.
Display and judging rules
Exhibit materials must be the students’ own wording; displays should note any AI-created images/figures and, where required, function without internet access. Judges evaluate students’ understanding of methods and AI’s role.
Consultation and approvals
Students must consult mentors/teachers and obtain Scientific Review Committee (SRC) and Institutional Review Board (IRB) approval when AI is used in study design or in ways that affect human subjects, data collection, or interpretation.
To be completed by the student and mentor/teacher:
Project title: _____________________________________________________________________
Student name: ___________________________________________________________________
Date(s) of AI use: ________________________________________________________________
AI tool(s) used (name + version/URL):
1. Purpose of AI use
Briefly state why AI was used (e.g., literature search, brainstorming, code prototype, grammar/editing, image generation, data analysis help).
2. Prompt(s) given
- Prompt 1 (date/time): [Exact prompt text]
- Prompt 2 (date/time): [Exact prompt text]
- etc.
3. AI response(s)
- Response 1: [Exact AI response or excerpt]
- Response 2: [Exact AI response or excerpt]
- etc.
4. How was the AI response used or modified?
- Used verbatim? Yes / No
- If modified, describe edits and rationale:
  - Edits made: [short list of changes]
  - Why edited: [accuracy, clarity, style, removal of errors, etc.]
5. Validation steps
- How did you verify the AI output (e.g., checked against peer-reviewed sources, reran code, experimental replication, consulted mentor)?
- Sources used for verification (full citations or URLs):
  - [Citation 1]
  - [Citation 2]
6. Contribution to project
- What parts of the project were informed by AI (e.g., hypothesis refinement, figure creation, code snippets, grammar edits)?
- What parts remained the student's original work? (Be specific.)
7. Ethical/quality considerations
- Identify potential biases, limitations, or risks:
- Steps taken to mitigate these issues:
8. Citation / acknowledgement
- Suggested citation text to include in abstract/display/logbook:
- "Portions of this work (describe) used [AI tool name, version/URL] on [date]. Prompts and responses are recorded in the project logbook."
9. Mentor/teacher confirmation
We, [Mentor/Teacher Name] and [Student(s) Name(s)], affirm that we have read and understand the Tri-County Science & Technology High School Fair AI guidelines. We acknowledge that providing false information in the AI Use Logbook, or being found by judges or staff to have misused AI, will lead to disqualification.
Mentor/Teacher Signature:______________________________________________________________Date: ______________________
Student(s) Signature: __________________________________________________________________Date:______________________
Your Exhibit Evaluation Form will show the names of your selected exhibits. Below each name is an area for you to enter a score for each of the five criteria: Creative Ability, Scientific Thought, Thoroughness, Skill, and Clarity.
Select a score for each criterion on the same 0-10 integer scale, using the Scoring Descriptors table below.
The Exhibit Evaluation Form will NOT allow you to submit your scores unless you have entered a score for each criterion of each of your selected exhibits.
After receiving your scores electronically, we use a separate spreadsheet to apply a weight to your integer score for each criterion, based on the exhibit's level:
The Overall Score from a judge is calculated on a scale of 0-100 using a weighted geometric mean of the scores given for each criterion.
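For illustration only, here is a minimal Python sketch of how such a weighted geometric mean could be computed. The weights shown are hypothetical placeholders (the fair's actual per-criterion weights come from the separate spreadsheet and depend on level), and the function name is ours, not part of the fair's tooling:

    import math

    # Hypothetical weights for illustration only; they sum to 1 so the
    # weighted geometric mean stays on the same 0-10 scale as the scores.
    WEIGHTS = {
        "Creative Ability": 0.30,
        "Scientific Thought": 0.30,
        "Thoroughness": 0.15,
        "Skill": 0.15,
        "Clarity": 0.10,
    }

    def overall_score(scores):
        """Weighted geometric mean of 0-10 criterion scores, rescaled to 0-100."""
        # prod(score ** weight), with weights summing to 1, yields a value
        # between 0 and 10; multiplying by 10 maps it onto 0-100.
        gm = math.prod(scores[c] ** w for c, w in WEIGHTS.items())
        return 10 * gm

    # Example: overall_score({"Creative Ability": 8, "Scientific Thought": 9,
    #                         "Thoroughness": 7, "Skill": 8, "Clarity": 9})

One property of a geometric mean worth noticing: if any single criterion is scored 0, the Overall Score is 0, which is consistent with the IACUC/IRB and AI hard stops described above.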
The Final Rating of an exhibit, which determines its rank among other exhibits in its level and category, is based on a mean of all the Overall Scores for that exhibit, statistically adjusted (using Z-scores) for variation in scoring among all judges and compensating for especially lenient or strict judges.
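The exact adjustment formula is not reproduced here, but as a rough sketch under one common approach (standardizing each judge's Overall Scores against that judge's own mean and standard deviation), it might look like this:

    from statistics import mean, stdev

    def standardize(judge_scores):
        """Convert one judge's Overall Scores to Z-scores.

        Standardizing against the judge's own mean and standard deviation
        puts a lenient judge and a strict judge on the same scale; this
        sketch assumes the judge scored at least two exhibits and did not
        give every exhibit an identical score.
        """
        m, s = mean(judge_scores), stdev(judge_scores)
        return [(x - m) / s for x in judge_scores]

    # An exhibit's Final Rating would then be the mean of its standardized
    # Overall Scores across all judges who evaluated it.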
The results will be posted on this website. We will also publicize the results through local media, notify award winners by email, and send awards out by mail.
Please direct any judging questions or comments regarding this rubric to the Judges' Committee at Judges@DiscoveryCtr.org or 845-621-1260 x-402.
Remember: you are not just evaluating presentations; you are auditing research projects for scientific rigor, ethical compliance, and student understanding. Statistical analysis, repeated trials, and specific techniques must be documented in raw data, graphs, and methodology.
Documentation is your primary evidence! In a virtual setting, what's presented on video, written down, logged, and submitted is what you evaluate. Absence of evidence (logbooks, protocols, AI logs) is evidence of absence. Don't infer! Look for proof and score accordingly when it is insufficient or missing altogether.
We feel our rubric is important to follow because:
For judges: This rubric removes impression-based assessment and replaces it with verifiable, document-centered evaluation. You have clear disqualifiers, explicit criteria, and statistical protection against your own bias. You're not guessing—you're auditing.
For students: The framework rewards thoroughness and transparency over flash. A well-documented, modestly executed study with honest AI disclosure outscores a polished presentation with hidden shortcuts. The questionnaire forces reflection and articulation of reasoning, not just results.
For fairness: The weighted geometric mean + Z-score adjustment ensures that a lenient judge's scores don't unfairly boost projects, and a strict judge's scores don't unfairly penalize them.
For integrity: Hard stops on IACUC/IRB and AI compliance signal that ethics aren't negotiable. Exhibits with broken links or unintelligible audio are penalized under the Clarity criterion. These guidelines mirror professional research standards and protect the fair's credibility.
Please proceed to the next section to learn more about the Participating Schools & Conflict of Interest.