Educational Guide
Post-Exam Psychometric Analysis at ABHS
Data with Purpose – A Guide to Post-Exam Psychometric Analysis at ABHS
Introduction
This guide is designed to support assessment leaders, scientific council members, and exam committees within the Arab Board of Health Specializations (ABHS) in understanding and applying post-exam psychometric analysis to improve the quality, fairness, and effectiveness of high-stakes examinations.
Psychometric analysis is a critical tool in the ABHS assessment cycle, not only for technical accuracy but also for fostering transparency, accountability, and regional consistency.
Section 1: What is Psychometric Analysis?
Psychometrics refers to the measurement of knowledge, skills, and competencies through scientifically validated methods. In ABHS, psychometric analysis is conducted after each exam to evaluate question quality and overall exam performance.
Key Components:
Item Difficulty (P-value): The proportion of candidates who answered the item correctly; lower values indicate more challenging questions.
Discrimination Index: Identifies how well an item separates strong vs. weak candidates.
Reliability (KR-20 / Cronbach’s Alpha): Measures internal consistency of the test. Target: ≥ 0.85.
Distractor Analysis: Reviews effectiveness of incorrect options.
Pass Rate Trends: Examines outcome consistency across specialties and countries.
OSCE-Specific Metrics: Evaluates rater consistency, timing, and task clarity per station.
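To make the first three components concrete, the following is a minimal Python sketch of how item difficulty, a classical upper–lower discrimination index, and KR-20 can be computed from a scored response matrix. The data and function names are illustrative only and do not reflect ABHS's actual software or exam data.

```python
# Hypothetical scored response matrix: rows = candidates, columns = items
# (1 = correct, 0 = incorrect). All values are illustrative.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

def item_difficulty(responses):
    """P-value per item: proportion of candidates answering correctly."""
    n = len(responses)
    return [sum(row[i] for row in responses) / n
            for i in range(len(responses[0]))]

def discrimination_index(responses):
    """Upper-lower (27%) discrimination: P(upper group) - P(lower group)."""
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, round(0.27 * len(ranked)))
    upper, lower = ranked[:k], ranked[-k:]
    return [sum(r[i] for r in upper) / k - sum(r[i] for r in lower) / k
            for i in range(len(responses[0]))]

def kr20(responses):
    """Kuder-Richardson 20 estimate of internal consistency
    (population variance used here for simplicity)."""
    n_items = len(responses[0])
    p = item_difficulty(responses)
    pq = sum(pi * (1 - pi) for pi in p)
    totals = [sum(row) for row in responses]
    mean = sum(totals) / len(totals)
    var = sum((t - mean) ** 2 for t in totals) / len(totals)
    return (n_items / (n_items - 1)) * (1 - pq / var)
```

In practice these statistics are computed by dedicated psychometric software over the full candidate cohort; the sketch only shows the underlying arithmetic behind the report figures.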
Section 2: How is Psychometric Data Used at ABHS?
Psychometric data is processed by the Assessment Unit using specialized software and shared in structured reports with:
Scientific councils
The Executive Committee
The Measurement and Exam Committee
Each report includes visual dashboards, item-level analysis, and benchmarks to support decision-making.
Section 3: Practical Applications
Item Bank Quality Improvement
Revise or discard low-quality items.
Build a high-performing, reusable item bank.
Scientific Council Feedback
Use category performance to align curriculum and training.
Blueprint Validation and Adjustment
Ensure proportional representation of domains.
Pass Score Decisions
Apply statistical methods (Angoff, Borderline Regression) supported by psychometric data.
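As an illustration of the Borderline Regression method named above: checklist scores are regressed on examiners' global ratings, and the pass score is read off at the "borderline" rating. The data, scale, and function below are hypothetical, a minimal sketch rather than ABHS's actual standard-setting procedure.

```python
# Hypothetical OSCE station data: each candidate has a checklist score
# (0-20) and an examiner global rating on a 1-5 scale, where 3 means
# "borderline". All numbers are illustrative.
checklist = [8, 10, 11, 13, 14, 16, 17, 19]
global_rating = [1, 2, 2, 3, 3, 4, 4, 5]

def borderline_regression_cut(scores, ratings, borderline=3):
    """Fit scores = a + b * rating by least squares, then return the
    predicted checklist score at the borderline rating as the cut score."""
    n = len(scores)
    mx = sum(ratings) / n
    my = sum(scores) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(ratings, scores))
    sxx = sum((x - mx) ** 2 for x in ratings)
    b = sxy / sxx
    a = my - b * mx
    return a + b * borderline
```

The Angoff method, by contrast, pools judges' item-level estimates before the exam; psychometric data then serves as a reality check on the resulting cut score.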
Candidate and Center Feedback
Share aggregate strengths and weaknesses.
Benchmarking Across ABHS Training Centers
Identify discrepancies across centers or countries.
Quality Assurance and Auditing
Submit psychometric summaries to central governance.
Exam Security and Anomaly Detection
Spot score patterns that may indicate irregularities.
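One simple way anomaly screening of the kind described above can work is to flag centers whose pass rates deviate sharply from the overall mean. The sketch below uses hypothetical center names and rates, and a flagged center is only a prompt for review, not evidence of irregularity.

```python
# Hypothetical pass rates per training center; names and values
# are illustrative only.
pass_rates = {
    "Center A": 0.72, "Center B": 0.68, "Center C": 0.70,
    "Center D": 0.71, "Center E": 0.93, "Center F": 0.69,
}

def flag_outliers(rates, z_threshold=2.0):
    """Flag centers whose pass rate lies more than z_threshold standard
    deviations from the mean -- a starting point for human review."""
    values = list(rates.values())
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [c for c, v in rates.items() if abs(v - mean) > z_threshold * sd]
```

Real screening would also consider cohort size, response-pattern similarity, and year-on-year trends before escalating a flag.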
Section 4: Recommendations for Implementation
Before the Exam:
Ensure each item is pre-tagged to domains.
Maintain blueprint alignment.
Immediately After the Exam:
Conduct automated scoring and reliability checks.
One Week Post-Exam:
Review the full psychometric report.
Convene item review panels.
Within One Month:
Adjust the blueprint or item bank as needed.
Provide summarized feedback to training programs.
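The blueprint-alignment steps above can be sketched as a simple check comparing each domain's observed share of items against its blueprint target. The domains, targets, and tolerance below are hypothetical placeholders, not an actual ABHS blueprint.

```python
# Hypothetical blueprint: target proportion of items per domain,
# plus domain tags for the items on the current exam form.
blueprint = {"Cardiology": 0.30, "Pulmonology": 0.25,
             "Nephrology": 0.25, "Endocrinology": 0.20}
item_domains = (["Cardiology"] * 40 + ["Pulmonology"] * 25 +
                ["Nephrology"] * 17 + ["Endocrinology"] * 18)

def blueprint_deviation(blueprint, item_domains, tolerance=0.05):
    """Return {domain: (target, observed)} for domains whose observed
    share of items deviates from the blueprint target by more than
    the tolerance."""
    n = len(item_domains)
    flagged = {}
    for domain, target in blueprint.items():
        observed = item_domains.count(domain) / n
        if abs(observed - target) > tolerance:
            flagged[domain] = (target, round(observed, 3))
    return flagged
```

A check like this presupposes the pre-exam tagging step: only items tagged to domains before administration can be audited against the blueprint afterwards.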
Conclusion
Post-exam psychometric analysis is more than a statistical exercise—it is a strategic tool that empowers ABHS to ensure assessments are valid, equitable, and improvement-oriented. Scientific councils and examiners are encouraged to engage with these reports not just to measure, but to evolve the quality of assessment and training in the Arab region.
Prepared by the ABHS Assessment Unit
For internal training and capacity building use