Note: Once you pass the quiz with a score of at least 75%, print the certificate or take a screenshot and attach it, then register here to obtain a verified skill certificate.
Module 1: Introduction and Purpose
What is this? This training module outlines the key aspects of the "Draft Framework for Accreditation and Ranking of Colleges Regulated by National Medical Commission (NMC)."
Issuing Authority: National Medical Commission (NMC), Government of India, specifically the Coordination Division and the Medical Assessment and Rating Board (MARB).
Objective of the Framework:
To carry out accreditation and rating of all Medical Colleges regulated by the NMC.
This process will be conducted through an independent third-party agency.
The framework aims to assess colleges based on defined criteria and parameters.
Current Status (as per Public Notice): The draft framework is in the public domain for seeking comments and suggestions from stakeholders.
Overall Structure: The framework is built upon 11 Key Criteria and a total of 78 Parameters, with a total allocated weightage of 1000 points.
Module 2: Core Assessment & Rating Criteria (Overview & Weightage)
The framework outlines 11 distinct criteria for assessing medical colleges. Each criterion has a specific focus and allocated weightage:
Criterion 1: Curriculum Implementation and Capacity Building Activities (Weightage: 100)
Focus: Implementation of Competence-Based Curriculum (UG Program), elective courses, functioning of Academic Council/Curriculum Committee/MEU, faculty development programs (FDPs), and collaborations.
Criterion 2: Clinical Exposure, Clinical Training, Internship and Clinical Facilities (Weightage: 100)
Focus: Hands-on experiences, availability of clinical materials, patient loads (OPD/IPD), laboratory/radiological investigations, and internship provisions.
Criterion 3: Teaching-Learning Environment: Physical, Psychological & Occupational (Weightage: 130)
Focus: Adequacy of Central Library, practical labs, skill labs, digital/audio-visual facilities, safety measures (anti-ragging, gender harassment, fire, BMW, HCAI), and student amenities (sports, hostel).
Criterion 4: Students’ Admission, Attainment of Competence & Progression (Weightage: 140)
Focus: Competency attainment (CBME), assessment of student competencies (skill lab/clinical setting), NEET-UG/PG scores of admitted students, and student progression to higher medical education.
Criterion 5: Human Resource & Teaching-Learning Process (Weightage: 160)
Focus: Faculty numbers (vis-à-vis intake), attrition rates, additional qualifications, academic presentations (faculty/students), contribution to course materials, and fellowship awards.
Criterion 6: Assessment Policy: Formative, Internal & Summative Assessment (Weightage: 60)
Focus: Formative, internal, and summative assessments vis-à-vis competence-based curriculum, use of logbooks, and remedial instruction.
Criterion 7: Research Output & Impact (Weightage: 100)
Focus: Research papers in indexed journals, citations, impact factors, funded research projects, and patents.
Criterion 8: Financial-Resource: Recurring & non-recurring expenditures (Weightage: 100)
Focus: Expenditure on consumables, books/journals, sports facilities, faculty development, salaries, and maintenance of equipment, indicating effective teaching-learning organization.
Criterion 9: Community Outreach Programs (Weightage: 40)
Focus: Implementation of Family Adoption Programme (FAP), diagnostic camps, and follow-up activities.
Criterion 10: Quality Assurance System (Weightage: 30)
Focus: Accreditation of laboratories/hospitals, compliance with SOPs, safety measures, and functioning of committees (Pharmacovigilance, AMS).
Criterion 11: Feedback & Perception of Stakeholders (Weightage: 40)
Focus: Feedback from students, staff, and alumni regarding facilities, training quality, work conditions, and overall satisfaction.
Module 3: Understanding Parameters, Weightages, and Scoring
Parameters: Each of the 11 criteria is broken down into specific parameters (total 78).
Qualitative Parameters: 26
Quantitative Parameters: 52
Allocation of Weightages:
The document (Pages 10-20) details the specific weightage allocated to each parameter under its respective criterion.
It also indicates whether each parameter is Qualitative or Quantitative and provides placeholders for Performance Levels (1-4).
Operational Definitions & Scoring Rubrics:
For each parameter, there's an operational explanation detailing what is being assessed.
Specific scoring rubrics are provided, typically categorizing performance into Level-1, Level-2, Level-3, and Level-4, with clear descriptors for each level.
Supporting documents required for verification are also listed.
Many parameters involve normalization formulas to ensure fair comparison across colleges.
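The framework itself defines the actual normalization formulas and level cut-offs; as an illustrative sketch only, a quantitative parameter could be min-max normalized against peer-college values and mapped into the four rubric bands. The equal-width bands and the peer range below are assumptions for demonstration, not figures from the draft.

```python
# Illustrative sketch only: the draft framework specifies its own normalization
# formulas and Level-1..Level-4 cut-offs. Min-max scaling and equal-width bands
# are assumed here for demonstration.

def normalize(value, min_val, max_val):
    """Scale a raw parameter value into the 0-1 range (min-max normalization)."""
    if max_val == min_val:
        return 0.0
    return (value - min_val) / (max_val - min_val)

def performance_level(score):
    """Map a normalized score to a rubric band (equal-width bands assumed)."""
    if score < 0.25:
        return "Level-1"
    elif score < 0.50:
        return "Level-2"
    elif score < 0.75:
        return "Level-3"
    return "Level-4"

# Hypothetical example: daily OPD attendance of 850, where peer colleges
# range from 200 to 1000.
score = normalize(850, 200, 1000)
print(score, performance_level(score))  # -> 0.8125 Level-4
```

The point of normalization is that raw counts (OPD attendance, surgeries, publications) vary with college size and location; scaling against a common range lets the same rubric bands apply fairly across institutions.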
Verification Process: This involves:
Desktop assessment of information.
Physical visits.
Interaction with sampled faculty, students (across different professional years), and interns.
Review of documents, records, logbooks, and physical facilities.
Module 4: Key Areas of Detailed Assessment (Examples from Rubrics)
Curriculum Implementation (Criterion 1): Alignment with NMC competencies, planning of electives, functioning of college council/MEU, FDP completion, MOUs.
Clinical Exposure (Criterion 2): Provision of clinical postings, OPD attendance, bed occupancy, number of surgeries, radiological/lab investigations.
Student Competency (Criterion 4): Demonstration of procedures by students in skill labs/clinical settings.
Faculty Resources (Criterion 5): Teaching methods, faculty numbers, staff attrition, awards, publications.
Assessment (Criterion 6): Regular internal assessments, use of formative methods, logbooks, review of student performance.
Research (Criterion 7): Publications, citations, impact factors, patents, funded projects.
Financials (Criterion 8): Spending on books, lab consumables, equipment maintenance, salaries, safety measures.
Safety & Infrastructure (Criterion 3): Library, labs, hostel facilities, anti-ragging, BMW management, fire safety, AERB compliance.
Module 5: Call for Stakeholder Engagement
Action Required: All stakeholders (Deans/Principals of Medical Colleges, and others) are requested to submit their comments/suggestions on this Draft Framework.
Submission Method: Through an online form.
Link: https://forms.gle/nuk5fJgnxyKQZudk9
Deadline: Within 21 days from the date of publication of the notice (Notice dated 10-05-2025). This means submissions are due by approximately May 31, 2025.
Contact (for queries, implied by signature): Raghav Langer, Secretary, NMC (r.langer@ias.nic.in).
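As a quick check of the stated window, the feedback deadline follows directly from the notice date (10-05-2025) plus 21 days:

```python
from datetime import date, timedelta

notice_date = date(2025, 5, 10)            # public notice dated 10-05-2025
deadline = notice_date + timedelta(days=21)  # "within 21 days from publication"
print(deadline.isoformat())  # -> 2025-05-31
```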
Summary
Document Title: Draft Framework for Accreditation and Ranking of Colleges Regulated by National Medical Commission (NMC).
Issuing Body: National Medical Commission (NMC), Government of India.
Purpose: To establish a standardized framework for accrediting and ranking medical colleges in India, through an independent third-party agency, to ensure quality and promote excellence in medical education. The draft is currently open for public/stakeholder feedback.
Core Structure:
11 Assessment & Rating Criteria
78 Parameters (26 Qualitative, 52 Quantitative)
Total Allocated Weightage: 1000 points
List of 11 Criteria:
Curriculum Implementation and Capacity Building Activities
Clinical Exposure, Clinical Training, Internship and Clinical Facilities
Teaching-Learning Environment: Physical, Psychological & Occupational
Students’ Admission, Attainment of Competence & Progression
Human Resource & Teaching-Learning Process
Assessment Policy: Formative, Internal & Summative Assessment
Research Output & Impact
Financial-Resource: Recurring & non-recurring expenditures
Community Outreach Programs
Quality Assurance System
Feedback & Perception of Stakeholders
Assessment Methodology: Involves detailed operational definitions, scoring rubrics (typically Level 1-4) for each parameter, verification through document review, physical inspection, and interaction with faculty and students.
Call to Action for Stakeholders:
Submit Comments/Suggestions: On the draft framework.
Deadline: Within 21 days from May 10, 2025 (i.e., by approx. May 31, 2025).
Submission Link: https://forms.gle/nuk5fJgnxyKQZudk9
Module 1: Introduction and Purpose - Understanding the NMC's Draft Framework for Medical College Accreditation & Ranking
1.1. What is this Framework? - The Foundation for Quality Assessment
Official Title: The document under review is formally titled the "Draft Framework for Accreditation and Ranking of Colleges Regulated by National Medical Commission (NMC)."
Nature of the Document:
It is a "Draft," meaning it is a preliminary version or a proposal. It is not a finalized or legally binding document in its current state. Its purpose is to solicit input and suggestions before being finalized.
It is a "Framework," indicating it provides a structured approach, a blueprint, or a set of guidelines and principles for the assessment process. It lays down the overarching methodology, criteria, and parameters.
Scope of Application: This framework is intended for all Medical Colleges that are regulated by the National Medical Commission (NMC) across India. This includes both government and private institutions offering medical education under the purview of the NMC.
Primary Goal: The overarching goal is to establish a standardized, transparent, and robust system for:
Accreditation: Formally recognizing medical colleges that meet predefined quality standards in medical education and infrastructure.
Ranking: Systematically evaluating and positioning medical colleges relative to each other based on their performance against the framework's criteria. This aims to foster healthy competition and provide stakeholders with comparative insights.
Significance: This framework represents a significant step towards enhancing the quality of medical education in India. By standardizing the evaluation process, it aims to ensure that all medical graduates possess the necessary competencies and that medical institutions maintain high standards of teaching, clinical training, research, and infrastructure.
1.2. The Issuing Authority - Who is Behind This Initiative?
The framework is an initiative of the Government of India, spearheaded by the National Medical Commission (NMC) and its constituent bodies.
National Medical Commission (NMC):
The NMC is the apex regulatory body for medical education and medical professionals in India. It was established by the National Medical Commission Act, 2019, replacing the erstwhile Medical Council of India (MCI).
Its key mandates include:
Maintaining high standards of medical education.
Regulating medical institutions and professionals.
Ensuring the availability of adequate and high-quality medical professionals.
Promoting ethical medical practice.
Overseeing the assessment and rating of medical institutions.
The public notice (CDN-20011/65/2025-Coord-NMC) clearly indicates the NMC's central role.
Medical Assessment and Rating Board (MARB):
The MARB is one of the four autonomous boards constituted under the NMC Act, 2019.
As stated in the public notice, the MARB is exercising its powers conferred under Section 26(1)(d) of the National Medical Commission Act, 2019. This specific section empowers the MARB to:
"determine the procedure for assessing and rating the medical institutions for their compliance with the standards laid down by the Undergraduate Medical Education Board or the Postgraduate Medical Education Board, as the case may be" and
"grant permission for establishment of a new medical institution or to start any postgraduate course or to increase number of seats, based on the norms determined by the Undergraduate Medical Education Board or the Postgraduate Medical Education Board, as the case may be."
Therefore, the MARB is the primary body within the NMC responsible for developing and overseeing the implementation of this accreditation and rating framework. Its involvement ensures a specialized focus on assessment and quality assurance.
Coordination Division (NMC):
The Coordination Division of the NMC, as indicated by the file number on the public notice, is involved in the administrative and procedural aspects of this initiative.
This division likely plays a role in:
Facilitating communication between different boards and stakeholders.
Managing the public consultation process.
Ensuring logistical support for the development and eventual implementation of the framework.
Consolidating efforts and ensuring a unified approach from the NMC.
1.3. The Core Objective of the Framework - Why is this Being Done?
The framework has several intertwined objectives, all aimed at enhancing the quality and accountability of medical education in India.
To Carry Out Accreditation & Rating:
Accreditation: This involves a formal, peer-reviewed process to assess whether a medical college meets specific, predefined minimum standards of quality and performance. It is often a prerequisite for recognition and operation. Successful accreditation signifies that an institution provides a credible and adequate educational experience.
Rating (and subsequent Ranking): Beyond meeting minimum standards, rating involves a comparative evaluation of institutions based on a broader set of qualitative and quantitative parameters. This allows for differentiation among colleges, highlighting strengths and areas for improvement, and can lead to a formal ranking system.
For All Medical Colleges Regulated by the NMC:
The intent is to create a unified system applicable to every medical college under the NMC's jurisdiction. This ensures consistency and a level playing field, regardless of whether a college is publicly or privately funded, new or established.
Through an Independent Third-Party Agency:
Crucial for Objectivity: The decision to conduct the accreditation and rating process through an independent third-party agency is highly significant. This aims to:
Ensure impartiality and eliminate potential biases or conflicts of interest that might arise if the NMC itself (or its direct arms) conducted all aspects of the assessment.
Enhance the credibility and trustworthiness of the assessment outcomes, both nationally and internationally.
Leverage specialized expertise in assessment methodologies, data analysis, and quality assurance processes that a dedicated agency might possess.
This is a common best practice in quality assurance frameworks worldwide.
Based on Defined Criteria and Parameters:
The framework is not arbitrary. It is built upon a systematic structure of clearly defined assessment areas.
This structured approach ensures:
Transparency: Colleges will understand how they are being evaluated.
Consistency: The same yardsticks will be applied to all institutions.
Comprehensiveness: The criteria aim to cover all critical aspects of medical education and institutional functioning.
1.4. Current Status and Call for Stakeholder Engagement - A Collaborative Approach
"Draft" Stage: As emphasized, the framework is currently a draft. This means it is a work in progress and open to modification based on feedback.
Public Domain for Comments/Suggestions: The NMC has placed this draft in the public domain specifically to seek comments and suggestions from stakeholders.
Stakeholders in this context include, but are not limited to:
Deans/Principals and faculty of Medical Colleges.
Medical students and interns.
Medical professional bodies and associations.
Healthcare industry representatives.
The general public and patient advocacy groups.
This consultative process is vital for:
Incorporating diverse perspectives and practical insights.
Identifying potential challenges or ambiguities in the draft.
Enhancing the framework's relevance, acceptability, and effectiveness.
Ensuring a more democratic and participatory approach to policy-making.
Submission Process:
Method: Comments are to be submitted through a designated online form.
Link: The specific Google Forms link provided is https://forms.gle/nuk5fJgnxyKQZudk9.
Deadline: Submissions must be made within 21 days from the date of publication of the public notice. The notice is dated 10-05-2025. Therefore, the indicative deadline for feedback would be around May 31, 2025.
Signatory: The notice is signed by Raghav Langer, Secretary, NMC, indicating the official backing of the NMC secretariat for this consultative process.
1.5. Overall Structure of the Framework - A Glimpse into the Assessment Mechanism
Foundation of Assessment: The entire framework is built upon:
11 Key Criteria: These are the broad thematic areas under which medical colleges will be evaluated. Examples (as seen in the index and summary tables) include Curriculum Implementation, Clinical Exposure, Human Resources, Research Output, etc.
78 Parameters: Each of the 11 criteria is further broken down into a total of 78 specific, measurable indicators or parameters. These parameters can be either:
Qualitative (26 parameters): Assessing aspects that are descriptive, process-oriented, or based on observation and expert judgment.
Quantitative (52 parameters): Assessing aspects that can be measured numerically or through data points.
Weightage System:
A total allocated weightage of 1000 points is distributed across these criteria and parameters. This quantitative scoring system will form the basis for the rating and subsequent ranking of the medical colleges.
The distribution of these 1000 points reflects the relative importance assigned by the NMC to different aspects of medical education and institutional performance.
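As a cross-check, the weightages and parameter counts listed per criterion in Module 2 of this document reconcile exactly with the stated totals (1000 points, 78 parameters, 26 qualitative, 52 quantitative):

```python
# Criterion -> (weightage, qualitative params, quantitative params),
# as listed in Module 2 of this document.
criteria = {
    "Curriculum Implementation & Capacity Building": (100, 3, 3),
    "Clinical Exposure, Training, Internship & Facilities": (100, 1, 10),
    "Teaching-Learning Environment": (130, 11, 0),
    "Students' Admission, Competence & Progression": (140, 2, 4),
    "Human Resource & Teaching-Learning Process": (160, 2, 9),
    "Assessment Policy": (60, 4, 0),
    "Research Output & Impact": (100, 0, 8),
    "Financial Resources": (100, 0, 10),
    "Community Outreach Programs": (40, 1, 1),
    "Quality Assurance System": (30, 2, 3),
    "Feedback & Perception of Stakeholders": (40, 0, 4),
}

total_weight = sum(w for w, _, _ in criteria.values())
total_qual = sum(q for _, q, _ in criteria.values())
total_quant = sum(n for _, _, n in criteria.values())

print(total_weight, total_qual, total_quant)  # -> 1000 26 52
```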
Module 1 Summary and Transition:
This introductory module has established the context, purpose, and authority behind the Draft Framework for Accreditation and Ranking of Medical Colleges. We've learned that this NMC-led initiative, driven by the MARB, aims to create a standardized, objective (via a third-party agency), and comprehensive system for evaluating all medical colleges in India. The framework is currently in a draft stage, inviting crucial stakeholder feedback to refine its 11 criteria and 78 parameters, which together carry a total weightage of 1000 points.
The subsequent modules will delve deeper into each of these 11 criteria, their constituent parameters, and the specific rubrics proposed for their assessment.
Module 2: Core Assessment & Rating Criteria – The Pillars of Evaluation (Overview & Weightage)
This module provides a detailed overview of the 11 distinct Assessment & Rating Criteria that form the backbone of the NMC's Draft Framework. Understanding each criterion's focus, its constituent parameters (qualitative vs. quantitative), and its allocated weightage is crucial for medical colleges to prepare for the accreditation and ranking process. The total weightage for all criteria sums up to 1000 points.
2.1. Criterion 1: Curriculum Implementation and Capacity Building Activities
Total Allocated Weightage: 100 points
Number of Parameters: 06
Qualitative: 3
Quantitative: 3
Operational Explanation (Focus):
This criterion is central to evaluating how effectively medical colleges are implementing the Competence-Based Curriculum (CBC) prescribed by Medical Regulators for the UG Program.
It assesses the process aspects of curriculum delivery, including the integration of theory, practical, and clinical experiences aligned with specified competencies.
It examines the mechanisms for capacity building among faculty to deliver this curriculum effectively.
Key Areas of Assessment (as per operational explanation & parameter list):
Implementation of Competency-Based Curriculum: Verifying alignment with program-specific competencies laid down by NMC, including horizontal and vertical integration.
Elective Courses: Planning and offering of elective courses within the scope defined by NMC, allowing students to explore areas of interest.
Functioning of Academic Governance Bodies:
Academic Council/College Council: Its constitution and role in curriculum planning, training program details, discipline enforcement, and academic matters.
Curriculum Committee (CC): Its role in ensuring the implementation and monitoring of the curriculum.
Medical Education Unit (MEU): Its constitution and role in conducting Faculty Development Programs (FDPs) based on training needs.
Faculty Development Programs (FDPs):
Tracking faculty participation in NMC-recognized FDPs like BCME (Basic Course in Medical Education), CISP (Curriculum Implementation Support Programme), and ACME/FIME (Advance Course in Medical Education/Fellowship in Medical Education).
Collaborations and MOUs:
Number and nature of collaborations/MOUs with national and international academic or research institutions over the past 2 years.
Assessing the outcomes of these collaborations (e.g., research projects, workshops, faculty/student exchange).
Significance: This criterion underscores the NMC's emphasis on a modern, competency-driven medical education system and the institutional mechanisms required to support it.
2.2. Criterion 2: Clinical Exposure, Clinical Training, Internship and Clinical Facilities
Total Allocated Weightage: 100 points
Number of Parameters: 11
Qualitative: 1
Quantitative: 10
Operational Explanation (Focus):
This criterion evaluates the practical, hands-on clinical experience provided to medical students, which is fundamental to their training.
It assesses the availability and adequacy of clinical materials, patient load, and facilities in various specialties within the attached teaching hospital(s).
The focus is on ensuring students receive sufficient exposure to real-world clinical scenarios to develop diagnostic and therapeutic skills.
Key Areas of Assessment:
Provision of Clinical Exposure: Ensuring students receive mandatory hands-on experience through clinical postings in diverse specialties (Medicine, Surgery, Community Medicine, OBG, etc.) and varied healthcare settings (including RHC/UHC).
Patient Load Indicators:
OPD attendance (specialty/clinical department wise).
Bed Occupancy Percentage (specialty wise).
Number of minor and major surgeries performed.
Daily patient admission/attendance in Casualty/Emergency Department.
Number of patients treated in Intensive Care Areas/High Dependency Units.
Number of deliveries (normal & C-Section).
Investigative Facilities & Usage:
Number of laboratory-based investigations carried out.
Number of radiological investigations performed.
Internship Facilities: Provision for internship training, including postings in community settings.
Significance: Directly reflects the quality of practical training, which is critical for producing competent medical practitioners. The emphasis on patient load assumes that higher patient numbers offer richer learning opportunities.
2.3. Criterion 3: Teaching-Learning Environment: Physical, Psychological & Occupational
Total Allocated Weightage: 130 points
Number of Parameters: 11
Qualitative: 11
Quantitative: 0
Operational Explanation (Focus):
This criterion aims to assess whether the medical college provides a conducive and supportive environment for teaching and learning, encompassing physical infrastructure, psychological well-being, and occupational safety.
It aligns with the "Establishment of Medical Institutions, Assessment & Rating Regulations, 2023," emphasizing a holistic environment.
Key Areas of Assessment:
Library Facilities: Adequacy, functionality, and optimum utilization of physical/digital library resources.
Laboratory Facilities: Adequacy and optimum utilization of student practical laboratories and Clinical & Procedural Skill Laboratory (including simulated settings).
ICT and MET Unit: Adequacy and utilization of ICT facilities and Medical Education Technology (MET) Unit for digital teaching-learning.
Student Amenities:
Provision and utilization of indoor/outdoor sports facilities and organization of sports/cultural programs.
Hostel accommodation capacities and associated safety measures.
Safety and Regulatory Compliance:
Provisions for prevention of ragging and gender harassment.
Provisions for Biomedical Waste (BMW) Management.
Provisions for Hospital Infection Control Measures (HCAI).
Provisions for safety measures for Diagnostic Radiology/Radiotherapy (AERB compliance).
Provisions for Fire Safety in campus (teaching block, hospital block, hostel block).
Significance: Highlights the importance of a safe, well-equipped, and supportive campus environment for effective learning and student well-being. All parameters here are qualitative, suggesting an assessment based on adequacy, functionality, and compliance rather than just numbers.
2.4. Criterion 4: Students’ Admission, Attainment of Competence & Progression
Total Allocated Weightage: 140 points
Number of Parameters: 06
Qualitative: 2
Quantitative: 4
Operational Explanation (Focus):
This criterion assesses aspects related to student intake quality, their attainment of prescribed competencies, and their progression to higher levels of medical education or professional practice.
It links directly to the effectiveness of the CBME and the overall quality of the educational program.
Key Areas of Assessment:
Attainment of Competence:
Demonstration of procedures by sampled students in Procedure & Clinical Skill Laboratory/Simulated Setting.
Demonstration of clinical procedures/clinical skill competency by sampled students/interns at clinical sites (Hospital).
Quality of Admitted Students (Proxy Measures):
Average NEET-UG scores/ranks of admitted students (unreserved category) over the last 5 academic years.
Average NEET-PG scores/ranks of UG alumni (minimum cut-off qualified, unreserved category) in the previous year.
Student Progression:
Number of UG alumni taking admission in PG courses under AIQ (MCC) and State Counselling in the previous year.
Performance in Examinations: Performance of students in Summative Assessment/Exit Examination in the last academic year.
Significance: This is a high-weightage criterion, reflecting the importance of student outcomes. NEET scores are used as proxies for college reputation and teaching quality.
2.5. Criterion 5: Human Resource & Teaching-Learning Process
Total Allocated Weightage: 160 points
Number of Parameters: 11
Qualitative: 2
Quantitative: 9
Operational Explanation (Focus):
This criterion, with the highest weightage, focuses on the quality, quantity, and engagement of faculty, as well as student participation in the academic process.
It evaluates the resources and processes that directly contribute to the teaching-learning experience.
Key Areas of Assessment:
Faculty Strength and Stability:
Programme-wise number of recruited faculty staff vis-à-vis regulatory specifications.
Staff attrition rate in the past 2 years.
Faculty Qualifications and Expertise:
Percentage of faculty (Professor, Associate Professor, Assistant Professor) with additional professional qualifications (beyond minimum NMC requirements).
Teaching-Learning Methods:
Methods employed by sampled faculties in theory classes.
Methods employed by faculties for practical/clinical sessions in laboratory/simulated setting/bedside teaching.
Academic Engagement and Output:
Number of prestigious awards/grants (International/National/State) availed by students and faculty.
Number of extra/co-curricular student awards (UG students).
Faculty contribution to designing course materials (online & offline) on recognized platforms.
Number of paper presentations by faculty and academic presentations by students in recognized conferences/competitions.
Significance: Recognizes that faculty are the most critical resource in medical education. Their qualifications, teaching skills, stability, and academic contributions are paramount.
2.6. Criterion 6: Assessment Policy: Formative, Internal & Summative Assessment
Total Allocated Weightage: 60 points
Number of Parameters: 04
Qualitative: 4
Quantitative: 0
Operational Explanation (Focus):
This criterion predominantly deals with the assessment practices of the college, focusing on formative, internal, and summative assessments within the competence-based curriculum framework.
It examines adherence to guidelines (e.g., CISP-2019, Logbook guideline-2019, GMER-2019 & 2023).
Key Areas of Assessment:
Regularity and Alignment of Internal Assessments (IA): Conducting periodical IA examinations for theory & practical/clinical as per NMC guidelines for competency-based assessment (CBA).
Usage of Formative Assessment Methods: Application of formative assessment methods vis-à-vis Continuous and Comprehensive Assessment processes.
Log Books & Portfolio: Utilization of log books and portfolios for tracking student learning progress in clinical skills/competencies.
Analysis and Remediation: Department-wise analysis and review of students’ performance in formative & internal assessments and the implementation of corrective/remedial actions.
Significance: Emphasizes the importance of robust assessment policies that not only evaluate but also guide student learning and ensure competency attainment.
2.7. Criterion 7: Research Output & Impact
Total Allocated Weightage: 100 points
Number of Parameters: 08
Qualitative: 0
Quantitative: 8
Operational Explanation (Focus):
This criterion assesses the research productivity and impact of the medical institution, aligning with the rating parameter "The research output of the medical institution that has contributed to the existing knowledge and the research impact created by the medical institution" (Assessment and Rating Regulations-2023).
It considers both the quantity and quality of research activities.
Key Areas of Assessment:
Publications:
Total number of research paper publications by faculty staff (with institutional affiliation) in indexed journals in the last 2 years.
Cumulative citation scores of these research papers.
Cumulative impact factors of all publications from the institute in indexed journals in the last 2 years.
Patents and Commercialization:
Number of patents/design registrations filed by the institution.
Number of patents granted, converted to products, and commercialized.
Funded Research Projects:
Number of extramural funded projects (completed/ongoing) in collaboration with Industry/Non-government (National, State/International) funding agencies in the last 2 Financial Years (some rubrics differentiate these by funding amount, which appears to be a typo; this is likely a single parameter).
Clinical Trials: Number of clinical trials initiated/ongoing/approved for different phases in the last 2 calendar years.
Significance: Highlights the role of medical colleges in advancing medical knowledge and innovation. All parameters are quantitative, focusing on measurable research outputs.
2.8. Criterion 8: Financial-Resource: Recurring & non-recurring expenditures
Total Allocated Weightage: 100 points
Number of Parameters: 10
Qualitative: 0
Quantitative: 10
Operational Explanation (Focus):
This criterion uses financial expenditure as a proxy for the effectiveness of the teaching-learning process and clinical training.
It assesses how financial resources are allocated to support various academic, infrastructural, and operational needs, aligning with UGMEB standards.
Key Areas of Assessment (Expenditure in the previous financial year on):
Procurement of books & journals and other learning resources.
Procurement of consumable lab-based materials.
Maintenance of radiological equipment.
Procurement of non-consumable equipment in clinical laboratories (teaching hospital).
Consumable resources for indoor & outdoor sports.
Salary for faculty staff and residents.
Percentage of electricity obtained from renewable energy (solar/wind).
Procurement of consumable materials for clinical/operational works in OT.
Maintenance of non-consumable equipment in OT.
Strengthening of safety measures in campus.
Significance: While indirect, financial investment is seen as an indicator of an institution's commitment to providing necessary resources for quality education and patient care.
2.9. Criterion 9: Community Outreach Programs
Total Allocated Weightage: 40 points
Number of Parameters: 02
Qualitative: 1
Quantitative: 1
Operational Explanation (Focus):
This criterion evaluates the medical college's engagement in community outreach activities, particularly the Family Adoption Programme (FAP), which is an essential component of the curriculum (GMER-2023).
Key Areas of Assessment:
Family Adoption Programme (FAP):
Number of families adopted by students.
Organization of diagnostic camps in adopted villages for screening & identification of disease/ill-health & malnutrition.
Impact of FAP: Impact of family adoption/therapeutic intervention on the health outcomes of the adopted family (qualitative assessment).
Significance: Reflects the college's commitment to social responsibility and providing students with exposure to community health issues.
2.10. Criterion 10: Quality Assurance System
Total Allocated Weightage: 30 points
Number of Parameters: 05
Qualitative: 2
Quantitative: 3
Operational Explanation (Focus):
This criterion assesses the formal Quality Assurance System (QAS) in place at the medical college.
It includes accreditations by recognized bodies and the implementation of internal quality control mechanisms.
Key Areas of Assessment:
Accreditations:
Accreditations of Clinical Laboratories by NABL or nationally recognized body.
NABH Accreditation of parent/attached hospital.
Regulatory Compliance: Legal Licenses (availability & validity as per NMC guidelines).
Internal Committees and Systems:
Functioning of Pharmacovigilance Committee.
Constitution and functioning of Antimicrobial Stewardship (AMS) Committee.
Significance: Emphasizes the importance of established systems and external validations to ensure and continuously improve quality.
2.11. Criterion 11: Feedback & Perception of Stakeholders
Total Allocated Weightage: 40 points
Number of Parameters: 04
Qualitative: 0
Quantitative: 4 (though the "perception" aspect implies qualitative data collection feeding into quantitative scores)
Operational Explanation (Focus):
This criterion focuses on capturing feedback and perceptions from various stakeholders (students, staff, alumni, patients) regarding the quality of the medical college and its facilities.
It aims to understand the effectiveness of training and the overall institutional experience from different perspectives.
Key Areas of Assessment:
Student Feedback: Feedback from sampled students & Inspiration Index (likely a composite score from student responses).
Faculty Feedback: Feedback from sampled Faculty & Loyalty Index (likely a composite score from faculty responses).
Alumni Perception: Perception of Alumni towards the quality of the Institution.
Patient Perception: Perception of Patients towards Health Care Services.
Significance: Acknowledges that stakeholder perception is a valuable indicator of institutional quality and the impact of its educational and healthcare services.
Module 2 Summary and Transition:
This module has provided a comprehensive overview of the 11 core criteria that will be used to assess and rate medical colleges under the NMC's draft framework. We've explored the specific focus of each criterion, the types of parameters it includes (qualitative/quantitative), and the total weightage it carries in the overall 1000-point system. This foundational understanding is essential before moving on to the detailed operational definitions and scoring rubrics for individual parameters within each criterion.
The subsequent modules will delve into these specific parameters and their scoring mechanisms, offering a more granular view of the assessment process.
Module 3: Understanding Parameters, Weightages, and Scoring – The Mechanics of Evaluation
This module delves into the core mechanics of how medical colleges will be assessed under the NMC's Draft Framework. It focuses on three critical components: Parameters (the specific indicators of performance), Weightages (the relative importance assigned to these indicators), and the Operational Definitions & Scoring Rubrics (the detailed guidelines for measurement and evaluation). A thorough understanding of these elements is essential for interpreting the framework and preparing for the assessment process.
3.1. Parameters: The Specific Indicators of Performance
Definition and Role:
Parameters are the most granular level of assessment within the framework. They are specific, measurable (either qualitatively or quantitatively) indicators that, when taken together, provide evidence for a college's performance against a broader criterion.
Each of the 11 Core Criteria is broken down into several parameters. As stated in Module 1 and evidenced on Page 4 ("1.0 Scheme of Criteria related Parameters & allocated weightages"), there are a total of 78 parameters across the 11 criteria.
These parameters serve as the concrete points of evaluation during the assessment process.
Types of Parameters: The framework distinguishes between two types of parameters:
A. Qualitative Parameters (26 in total):
Nature: These parameters assess aspects that are often descriptive, process-oriented, or rely on expert judgment and observation rather than purely numerical data. They evaluate the "how" and "what quality" of certain institutional functions or provisions.
Assessment Method: Evaluation of qualitative parameters typically involves:
Review of documented policies, procedures, and plans.
Interaction with faculty, students, and staff.
Observation of facilities and processes during site visits.
Assessing the adequacy, functionality, and alignment with prescribed standards.
Examples from the Document (referencing Page 11, Criterion 1 - Curriculum Implementation):
Parameter 1: "Implementation of Curriculum by Institution/College in alignment with Program Specific Competences laid down by NMC." (Weightage: 30) – This assesses the degree of alignment and the process of implementation, not just a count.
Parameter 2: "Planning & offering of Elective Courses being offered by College/Institution within scope laid down by NMC." (Weightage: 10) – This looks at the process of planning and the nature of offerings.
Parameter 3: "Functioning of College Council, Curriculum Committees & Medical Education Unit (MEU)." (Weightage: 20) – This evaluates the effectiveness and adherence to roles of these bodies.
Scoring: Even though qualitative, these parameters are still scored based on defined rubrics that translate observed quality or process adherence into a performance level (typically 1-4), which then corresponds to a score.
B. Quantitative Parameters (52 in total):
Nature: These parameters assess aspects that can be measured numerically or through verifiable data points. They focus on "how much," "how many," or "what percentage."
Assessment Method: Evaluation of quantitative parameters typically involves:
Collection and verification of specific data from the institution (e.g., numbers, counts, financial figures, percentages).
Analysis of this data, often involving normalization to allow for fair comparisons between institutions of different sizes or contexts.
Examples from the Document (referencing Page 11, Criterion 1 - Curriculum Implementation):
Parameter 4: "Faculty wise completed Faculty Development Programmes (FDP) vis-a-vis FDP Guidelines of NMC." (Weightage: 10) – This would involve counting the number or percentage of faculty completing FDPs.
Parameter 5: "No. of Collaborations/MOU's with National & International Institutions in the past 2 years." (Weightage: 10) – This is a direct count.
Parameter 6 (Page 12): "Outcomes of MOUs/Agreement signed for Collaboration/Partnering with Institutions in India & abroad vis-a-vis Parameter-5." (Weightage: 20) – While "outcomes" can be qualitative, this parameter is listed as quantitative, suggesting specific, measurable outcomes are expected (e.g., number of joint publications, number of exchange programs conducted).
Scoring: Quantitative parameters are scored based on the collected data, often after applying normalization formulas detailed in the rubrics, to arrive at a performance level and corresponding score.
Relationship to Criteria: Parameters are subsumed under their respective criteria. The collective performance on all parameters within a criterion determines the overall performance for that criterion.
3.2. Weightages: Assigning Relative Importance
Definition and Purpose:
Weightage refers to the numerical value or proportion of the total score assigned to each criterion and, subsequently, to each parameter within that criterion.
The purpose of assigning weightages is to reflect the relative importance of different aspects of medical education and institutional functioning as perceived by the NMC. Aspects deemed more critical to quality will carry a higher weightage.
The total allocated weightage for the entire framework is 1000 points (as seen on Page 4).
Allocation of Weightages:
A. Criterion-Level Weightage:
The document (Page 4, "1.0 Scheme of Criteria related Parameters & allocated weightages") clearly lists the total allocated weightage for each of the 11 criteria.
Examples:
Criterion 1 (Curriculum Implementation...): 100 points
Criterion 3 (Teaching-Learning Environment...): 130 points
Criterion 5 (Human Resource & Teaching-Learning Process): 160 points (highest weightage)
Criterion 10 (Quality Assurance System): 30 points (lowest weightage)
B. Parameter-Level Weightage:
The total weightage allocated to a criterion is further distributed among its constituent parameters.
The section "3.0. Allocation of weightages to Parameters subsumed under Criteria" (Pages 10-20) details this breakdown.
Example (from Page 11, under Criterion 1 which has a total of 100 points):
Parameter 1.1 (Implementation of Curriculum...): 30 points
Parameter 1.2 (Planning & offering of Elective Courses...): 10 points
Parameter 1.3 (Functioning of College Council...): 20 points
Parameter 1.4 (Faculty wise completed FDP...): 10 points
Parameter 1.5 (No. of Collaborations/MOU's...): 10 points
Parameter 1.6 (Outcomes of MOUs...): 20 points
(Sum for Criterion 1 = 30+10+20+10+10+20 = 100 points)
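The consistency check above can be sketched in a few lines of Python. The parameter names and weightage values come from the document (Criterion 1); the dictionary structure is just an illustrative convenience:

```python
# Sanity check: parameter-level weightages under a criterion must sum to the
# criterion's total allocated weightage (values for Criterion 1 from the draft).
criterion_total = 100
parameter_weightages = {
    "1.1 Implementation of Curriculum": 30,
    "1.2 Planning & offering of Elective Courses": 10,
    "1.3 Functioning of College Council/Curriculum Committees/MEU": 20,
    "1.4 Faculty Development Programmes (FDP)": 10,
    "1.5 Collaborations/MOUs": 10,
    "1.6 Outcomes of MOUs": 20,
}
assert sum(parameter_weightages.values()) == criterion_total
```

The same check applies to every criterion: its parameter weightages must add up to the criterion total listed on Page 4.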
Implications of Weightages:
Colleges should pay close attention to the weightages, as performance in high-weightage areas will have a more significant impact on their overall score and subsequent ranking.
The weightage distribution provides insight into the NMC's priorities for quality medical education.
3.3. Operational Definitions & Scoring Rubrics: The Guidelines for Measurement
This is the most detailed part of the framework, providing specific instructions on how each parameter will be assessed and scored. Section "4.0. Operational Definition & Scoring Rubrics for Parameters subsumed under Criteria" begins on Page 20 and extends throughout much of the document.
A. Operational Explanation:
Purpose: For each parameter (or a group of closely related sub-parameters), an "Operational Explanation" is provided. This section:
Defines the scope and intent of the parameter.
Clarifies what specific aspects or information the assessors will be looking for.
Often references specific NMC regulations, guidelines, or source documents (e.g., GMER-2023, MSR-2023, FDP Guidelines).
May outline specific verification processes or data collection requirements particular to that parameter.
Example (from Page 20, Parameter 1.1 - Implementation of Curriculum...):
The explanation details that "practices of college pertaining to implementation of Competence Based Curriculum will be verified," including alignment of theory/practical/clinical experiences, facilitation of horizontal/vertical integration, and how competencies are developed (e.g., specification of learning domains, proficiency levels, teaching methods, assessment tools).
B. Scoring Rubrics:
Structure: Following the operational explanation, detailed "Scoring Rubrics" are provided for each parameter or its sub-parameters. These rubrics typically follow a structured format:
Levels of Performance: Most rubrics define 4 Levels of Performance (Level-1, Level-2, Level-3, Level-4). Level-1 usually represents the lowest level of compliance or performance, while Level-4 represents the highest.
Descriptors for Each Level: Each level is accompanied by a specific, observable descriptor that outlines the criteria a college must meet to achieve that level. These descriptors provide clarity on what constitutes poor, average, good, or excellent performance.
Example (from Page 22, Sub-parameter 1.1.1 - Alignment with Competences):
Level-1: "If Less than 50% sampled Faculties are able to show documented evidences about alignment..."
Level-2: "If 50% to 70% sampled Faculties are able to show documented evidences about alignment..."
Level-3: "If 71% to 90% sampled Faculties are able to show documented evidences about alignment..."
Level-4: "If more than 90% sampled Faculties are able to show documented evidences about alignment..."
"Plus" Levels: For some parameters, intermediate levels like "Level-2 plus" or "Level-3 plus" are used (e.g., Page 25, Parameter 1.2.2; Page 27, Parameter 1.3.1). These often indicate achieving the base level criteria plus an additional specific achievement.
Supporting Documents: The rubrics explicitly list the types of supporting documents or evidence that colleges will need to provide to substantiate their performance claims for each parameter (e.g., "Curriculum Plan, Teaching/Lesson Plans, Subject Attendance Register etc." on Page 22). This is crucial for the verification process.
C. Normalization Formulas (for many Quantitative Parameters):
Purpose: For many quantitative parameters where raw data might vary significantly due to factors like college size, intake capacity, or patient demographics, normalization formulas are applied. This is to ensure a fair and standardized comparison across diverse institutions. The goal is often to convert raw data into a common scale (e.g., 0 to 100).
Typical Formula Structure: A common formula seen in the document (e.g., Page 33 for MOUs; Page 42 for OPD Attendance) is:
((Score of the concerned College on the parameter (x’)) − (Minimum obtained score on the parameter (x))) / ((Maximum obtained score on the parameter (y)) − (Minimum obtained score on the parameter (x))) × 100
Where:
x’ = Score of the concerned college.
x = Minimum score obtained by any college across all colleges for that parameter.
y = Maximum score obtained by any college across all colleges for that parameter.
Application: The normalized score then determines the performance level (Level-1 to Level-4) based on predefined cut-offs (e.g., ≤25, >25 to ≤50, >50 to <75, ≥75).
Notes on Specifics: The document often includes important "Notes" clarifying aspects like the denominator to be used for averaging (e.g., sanctioned intake, number of faculty), or how to handle specific scenarios (e.g., if a college offers both UG and PG programs).
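As a sketch, the normalization formula and the quoted level cut-offs might be implemented as follows. The handling of the degenerate case where all colleges obtain the same score is an assumption, not something the document specifies:

```python
def normalize(x_prime, x_min, x_max):
    """Min-max normalization to a 0-100 scale, per the framework's formula:
    ((x' - x) / (y - x)) * 100."""
    if x_max == x_min:  # degenerate case (assumption): all colleges scored alike
        return 100.0
    return (x_prime - x_min) / (x_max - x_min) * 100

def performance_level(score):
    """Map a normalized score to a performance level using the cut-offs
    quoted in the draft: <=25 -> 1, >25 to <=50 -> 2, >50 to <75 -> 3, >=75 -> 4."""
    if score >= 75:
        return 4
    if score > 50:
        return 3
    if score > 25:
        return 2
    return 1

# Example: a college whose figure on a parameter is 60, where the minimum
# across all colleges is 20 and the maximum is 100.
s = normalize(60, 20, 100)   # (60-20)/(100-20)*100 = 50.0
print(s, performance_level(s))  # 50.0 2
```

Note that a college at the minimum always normalizes to 0 (Level-1) and one at the maximum to 100 (Level-4), regardless of absolute magnitudes; this is what makes the comparison size-independent.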
D. Computation of Weightage Score for Parameters:
For parameters that have multiple sub-parameters assessed through the rubrics, the document often provides a formula to calculate the overall weighted score for that main parameter.
Example (from Page 24, for Parameter 1.1, which has 5 sub-parameters):
Weightage score on Parameter-1.1 = ((Obtained score on 1.1.1)/4 + (Obtained score on 1.1.2)/4 + (Obtained score on 1.1.3)/4 + (Obtained score on 1.1.4)/4 + (Obtained score on 1.1.5)/S) * AW
Where S is likely the maximum possible score for sub-parameter 1.1.5 (if different from 4) and AW is the Assigned Weightage to Parameter-1.1 (which is 30 points). The division by 4 (or S) normalizes each sub-parameter's score before applying the overall weightage.
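A hedged sketch of this aggregation is below. One caveat: read literally, the printed formula sums the five fractions and multiplies by AW, which would allow a maximum of 5 × AW; this sketch assumes the fractions are averaged, so that a college at the top level on every sub-parameter earns exactly AW:

```python
def parameter_score(sub_scores, max_scores, assigned_weightage):
    """Weighted score for a parameter with several sub-parameters, following the
    Page-24 pattern: each sub-parameter's obtained score is divided by its
    maximum possible score (4, or S where it differs), and the result is
    scaled by the parameter's Assigned Weightage (AW).
    Assumption: the per-sub-parameter fractions are averaged, not merely
    summed, so the score is capped at AW."""
    fractions = [s / m for s, m in zip(sub_scores, max_scores)]
    return sum(fractions) / len(fractions) * assigned_weightage

# Hypothetical levels obtained on sub-parameters 1.1.1-1.1.5 (assuming S = 4
# for 1.1.5), with AW = 30 for Parameter 1.1.
print(parameter_score([4, 3, 3, 2, 4], [4, 4, 4, 4, 4], 30))  # 24.0
```

The same pattern recurs for Parameters 1.4, 2.2, 3.11, 4.1, 5.5, and 6.1, differing only in the number of sub-parameters and the AW applied.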
3.4. The Verification Process: Substantiating Performance
While not a direct part of "scoring," the verification process is intrinsically linked to how scores are determined and is detailed within the operational explanations and rubrics.
Desktop Assessment: Initial review of data and documents submitted by the college through the portal.
Physical Visits: On-site verification by the assessment team.
Interactions:
With sampled faculty (e.g., 25% from each department for Criterion 1).
With sampled students (across different professional years, with specific sampling strategies like CLT considerations mentioned on Page 21).
With Heads of Departments, MEU members, Curriculum Committee members, etc.
Document Review: Thorough examination of all supporting documents listed in the rubrics (logbooks, lesson plans, financial records, policy documents, etc.).
Physical Verification of Facilities: Inspection of libraries, labs, hospitals, hostels, etc.
Data Authentication: For critical data like OPD/IPD attendance, random sampling and physical verification are mentioned (e.g., Page 41), with provisions for score reduction if data manipulation is found.
Module 3 Summary and Transition:
This module has provided an in-depth look at the fundamental mechanics of the NMC's assessment framework. We've explored:
Parameters as the specific, measurable units of evaluation, categorized into qualitative and quantitative types.
Weightages as the mechanism for assigning relative importance to different criteria and parameters, guiding where colleges should focus their quality improvement efforts.
Operational Definitions and Scoring Rubrics as the detailed "how-to" guide for assessment, outlining what is measured, how it's measured (including normalization), the expected levels of performance, and the evidence required.
The integral role of a robust Verification Process in applying these rubrics and ensuring the credibility of the scores.
A clear grasp of these components – what will be looked at (parameters), how much it matters (weightages), and how performance will be judged (rubrics and verification) – is paramount for any medical college preparing for this accreditation and ranking process. The next logical step would be to examine the specific application of these principles within each of the 11 criteria.
Module 4: Key Areas of Detailed Assessment – Applying the Framework in Practice (Examples from Rubrics)
This module will illuminate the practical application of the NMC's Draft Framework by examining specific examples of parameters, their operational explanations, scoring rubrics, and verification methods as detailed in the document. This will provide a tangible understanding of the depth and rigor of the assessment process.
4.1. Criterion 1: Curriculum Implementation and Capacity Building Activities (Weightage: 100)
Focus: Effective implementation of Competency-Based Curriculum (CBC) and faculty preparedness.
Example Parameter 1.1: Implementation of Curriculum by Institution/College in alignment with Program Specific Competences laid down by NMC (Weightage: 30; Qualitative)
Operational Explanation (Page 20): This involves verifying how the college aligns theory, practical, and clinical experiences with prescribed competencies, facilitates horizontal/vertical integration, and develops competencies through specified learning domains (K, KH, S, SH, P), teaching methods, and assessment tools. References GMER-2023 and Competency-Based UG Curriculum documents.
Verification: Involves interaction with randomly selected faculty (25% from each pre-clinical, para-clinical & clinical department, not members of Curriculum Committee) and students (5% from each professional year, total 30-60 students, sampled based on CLT).
Scoring Rubrics (Pages 22-23):
Sub-parameter 1.1.1 (Alignment with Competences):
Level-1 to Level-4: Based on the percentage of sampled faculties (<50%, 50-70%, 71-90%, >90%) able to show documented evidence of alignment of Theory & Practical/clinical sessions with prescribed Competencies.
Supporting Documents: Curriculum Plan, Teaching/Lesson Plans, Subject Attendance Register etc.
Sub-parameter 1.1.2 (Specifications of Learning Objectives):
Level-1 to Level-4: Based on the percentage of sampled faculties able to show how theory & Practical/Clinical sessions are conducted in sync with competency-wise Learning Objectives.
Sub-parameter 1.1.3 (Specification of Competency or Proficiency Levels):
Level-1 to Level-4: Based on the percentage of sampled faculties able to show how sessions are organized by specifying Competency/proficiency levels based on Miller’s Pyramid (K, KH, S, SH & P).
Sub-parameter 1.1.4 (Integrated Teaching-Learning):
Level-1 to Level-4: Based on the percentage of sampled faculties able to show documented evidence of how Integrated teaching-learning sessions (Horizontal & Vertical) are planned & conducted.
Sub-parameter 1.1.5 (Interaction with Sampled Students):
Level-1 to Level-3 plus: Based on the percentage of sampled students (<30%, 30-50%) able to tell the type of competencies taught, describe recently organized integrated sessions, and identify proficiency levels required.
Supporting Documents: Log Books of students etc.
Weightage Calculation for Parameter 1.1 (Page 24): A formula aggregates scores from these 5 sub-parameters, multiplied by the Assigned Weightage (AW) of 30.
Example Parameter 1.4: Faculty wise completed Faculty Development Programmes (FDP) vis-à-vis FDP Guidelines of NMC (Weightage: 10; Quantitative)
Operational Explanation (Page 29): Derived from NMC guidelines for FDPs like BCME, CISP, ACME/FIME. Captures faculty-wise completion of these programs.
Scoring Rubrics (Pages 30-31):
Sub-parameter 1.4.1 (Percentage of Faculties completed BCME/rBCW):
Level-1 to Level-4: Based on percentage brackets (<50%, 50-70%, 71-90%, >90%) of faculties completing BCME within NMC scope.
Supporting Documents: Certificates from RC/NC or in-house FDP (with Observer/Coordinator signature).
Sub-parameter 1.4.2 (Percentage of Curriculum Committee Members completed BCME...): Similar levels for CC members.
Sub-parameter 1.4.3 (MEU Members completed BCME...): Similar levels for MEU members.
Sub-parameter 1.4.4 (No of faculties completed ACME at Nodal Centre): Levels based on actual numbers (No faculty, Min 1, 2 or more, 4 or more).
Sub-parameter 1.4.5 (Percentage of Faculties completed CISP): Similar percentage brackets as 1.4.1.
Weightage Calculation for Parameter 1.4 (Page 31): Formula aggregates scores from these 5 sub-parameters.
4.2. Criterion 2: Clinical Exposure, Clinical Training, Internship and Clinical Facilities (Weightage: 100)
Focus: Ensuring adequate hands-on clinical experience and availability of clinical resources.
Example Parameter 2.2: Specialty/Clinical Department wise OPD Attendance in the calendar year (Weightage: 15; Quantitative)
Operational Explanation (Page 41): Emphasizes patient load dependency for quality training. Mentions MSR-2023 minimum daily OPD attendance (8 patients/sanctioned intake).
Verification: Physical verification of randomly selected OPD Data. Auto-generated sampled months/days for verification. If data manipulation is found, OPD data shall be reduced.
Scoring Rubrics (Page 42):
Normalization Formula: ((x’) - (x)) / ((y) - (x)) * 100 where x’ is college's average per intake OPD data, y is max across colleges, x is min across colleges. Sanctioned intake of UG or UG+PG programs used as denominator.
Level-1 to Level-4: Based on normalized score ranges (≤25, >25 to ≤50, >50 to <75, ≥75).
Sub-parameters (2.2.1 to 2.2.4, Page 42-43): This scoring is applied separately for:
OPD Attendance across all clinical departments/specialties annually.
OPD Attendance across Medicine and allied specialties.
OPD Attendance across Surgery and allied specialties.
OPD Attendance in Obstetrics and Gynaecology.
Supporting Documents: Old & new Out-patient load data, OPD Register, Cash Receipts, Physical verification.
Weightage Calculation for Parameter 2.2 (Page 44): Formula aggregates scores from these 4 sub-parameters.
Example Parameter 2.4: Specialty wise Number of Minor surgeries performed in OT in past 1 Year (Weightage: 10; Quantitative)
Operational Explanation (Page 46): Emphasizes need for varied clinical materials, including minor surgeries in OTs. All surgeries under local anaesthesia are treated as minor surgeries.
Scoring Rubrics (Page 47):
Normalization Formula: Similar to OPD, based on average per intake performed minor surgeries.
Level-1 to Level-4: Based on normalized score ranges.
Supporting Documents: Data of minor operative works, Physical verification.
4.3. Criterion 4: Students’ Admission, Attainment of Competence & Progression (Weightage: 140)
Focus: Quality of student intake, competency attainment, and progression.
Example Parameter 4.1: Demonstration of procedures by Sampled students in Procedure & Clinical Skill Laboratory/Simulated Setting (Weightage: 35; Qualitative)
Operational Explanation (Page 81): Students gain hands-on experience in labs/simulated settings before actual patient interaction, developing mastery over skill competencies by practising on simulated patients (SPs) or computer-based simulations.
Sampling (Page 82): Total 30-60 students from all professional years. First & Second Prof. students demonstrate in Skill Lab/simulated setting. Stratified sampling based on NEET/University Exam performance (high, average, other).
Scoring Rubrics (Pages 83-86):
Multiple sub-parameters (4.1.1 to 4.1.9) for different subjects (Human Anatomy, Physiology, Biochemistry, AETCOM, Pathology, Pharmacology, Microbiology, Forensic Medicine, AETCOM-Skill Lab).
Level-1 to Level-4: Based on group performance (3 students per task) achieving <40%, 41-60%, 61-80%, >80% correctness.
Supporting Documents: App-based auto-generation of group of students, random assignment of tasks, app-based evaluation.
Weightage Calculation for Parameter 4.1 (Page 87): Averages scores from all 9 sub-parameters.
4.4. Criterion 5: Human Resource & Teaching-Learning Process (Weightage: 160)
Focus: Faculty quality, quantity, stability, and academic engagement.
Example Parameter 5.5: Staff attrition rate in past 2 Years (Weightage: 15; Quantitative)
Operational Explanation (Page 114): Quantifies the rate at which faculty (Professor, Associate Professor, Assistant Professor) and other teaching staff (Sr. Resident/Tutor/Demonstrator) depart. Positions vacated through resignation or termination are considered; retirement is not considered if it occurred within the last 2 years, otherwise it is considered.
Scoring Rubrics (Page 115):
Sub-parameters (5.5.1 to 5.5.3): Separate for Assistant Professor, Associate Professor/Reader, Professor.
Level-1: If >25% of teaching staff left.
Level-2: If <25% of teaching staff left.
Level-3: If <15% of teaching staff left.
Level-4: If <5% of teaching staff left.
Supporting Documents: Data of Faculty who resigned/terminated/retired provided by College on Assessment & Rating Portal.
Weightage Calculation for Parameter 5.5 (Page 115): Averages scores from the 3 sub-parameters. (The printed formula appears to contain a typo: it likely should divide by 3 rather than add 8, unless 8 is intended as a base score.)
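Taken at face value, the attrition bands quoted above overlap (a 10% rate satisfies both "<25%" and "<15%"). This sketch assumes the tightest matching band applies, which yields the non-overlapping reading Level-4: <5%, Level-3: 5-<15%, Level-2: 15-<25%, Level-1: ≥25%:

```python
def attrition_level(staff_left, staff_total):
    """Map a cadre's 2-year attrition percentage to the rubric levels quoted
    in the draft (>25% -> Level-1, <25% -> Level-2, <15% -> Level-3,
    <5% -> Level-4). Assumption: the printed bands overlap, so the tightest
    matching band is checked first."""
    pct = staff_left / staff_total * 100
    if pct < 5:
        return 4
    if pct < 15:
        return 3
    if pct < 25:
        return 2
    return 1

# Example: 3 of 40 Associate Professors left in the past 2 years -> 7.5%.
print(attrition_level(3, 40))  # 3
```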
4.5. Criterion 6: Assessment Policy: Formative, Internal & Summative Assessment (Weightage: 60)
Focus: Robustness of assessment practices aligned with CBC.
Example Parameter 6.1: Regular Periodical Internal Assessment (IA) Examinations for theory & Practical/Clinical vis-à-vis NMC Guideline for Competence Based Assessment (CBA) (Weightage: 15; Qualitative)
Operational Explanation (Page 135): IA exams (Theory, Practical/Clinical, Log Book, PDP-AETCOM) conducted prior to year-end Summative Assessment. Data for all professional years checked.
Verification: Faculty & student interaction.
Scoring Rubrics (Pages 136-140): Multiple sub-parameters:
6.1.1 (Conduct of required number of IA): Based on conducting exams as per prescribed numbers.
6.1.2 (Planning & Conduct of Theory Papers): Based on % of sampled faculties showing evidence of guidelines for developing theory papers aligned with K/KH levels.
6.1.3 (Planning & Conduct of Practical/Clinical Skill Assessments): Similar for S/SH/P levels.
6.1.4 (Planning and Conducting assessment of PDP-AETCOM): Based on inclusion of AETCOM questions and testing of acquired skills.
6.1.5 (Objective & Structured scoring process): Based on % of faculties able to produce objective, structured & self-explanatory scoring criteria.
6.1.6 (Scoring Performance of Students in IA - Interaction): Based on % of sampled students able to tell which assessment tools were used and how scoring criteria were applied.
Supporting Documents: Answer sheets, filled assessment tools, IA Theory Papers, Logbooks, Scoring Sheets.
Weightage Calculation for Parameter 6.1 (Page 140): Formula aggregates scores from these 6 sub-parameters.
4.6. Criterion 7: Research Output & Impact (Weightage: 100)
Focus: Institution's contribution to research and innovation.
Example Parameter 7.1: Total number of research paper publications by Faculty Staff with Institutional Affiliation in last 2 Years in indexed Journals (Weightage: 15; Quantitative)
Operational Explanation (Page 150): Considers papers in Medline/PubMed, Central Science Citation Index, etc. (DoAJ). List of indexed journals from 'Teacher Eligibility Qualifications in Medical Institutions-2022'.
Scoring Note: Each paper in a Q1 journal scores 200; Q2, 150; Q3/Q4, 100. Each paper is counted only once. If the first author is not associated with the college, 50% of the score is awarded.
Normalization Formula (Page 149 for Parameters 7.1, 7.2, 7.3): Standard normalization based on per faculty average score. Faculty recruited for MBBS programme only.
Scoring Rubrics (Page 151):
Level-1 to Level-4: Based on normalized score ranges (≤25, >25 to ≤50, >50 to <75, ≥75).
Supporting Documents: Submission details, uploaded soft copies of research papers published in indexed journals for given database only.
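The per-paper scoring note translates directly into code. This is an illustrative sketch: the quartile table and the 50% first-author rule reflect the note above, while the tuple representation and the assumption that duplicates are removed upstream are modelling choices:

```python
# Per-paper scores from the Parameter 7.1 scoring note (Q3 and Q4 both score 100).
QUARTILE_SCORES = {"Q1": 200, "Q2": 150, "Q3": 100, "Q4": 100}

def publication_score(papers):
    """papers: iterable of (quartile, first_author_affiliated) tuples.
    Each paper is counted once; de-duplication is assumed to happen upstream."""
    total = 0.0
    for quartile, first_author_affiliated in papers:
        score = QUARTILE_SCORES[quartile]
        if not first_author_affiliated:
            score *= 0.5  # first author not associated with the college
        total += score
    return total

# Example: one Q1 paper with an affiliated first author, one Q3 paper without.
print(publication_score([("Q1", True), ("Q3", False)]))  # 250.0
```

The resulting total would then be divided by the number of faculty (MBBS programme only, per the note on Page 149) and min-max normalized across colleges before level cut-offs are applied.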
4.7. Criterion 8: Financial-Resource: Recurring & non-recurring expenditures (Weightage: 100)
Focus: Financial investment as a proxy for effective teaching-learning.
Example Parameter 8.1: Total amount of Books & Journals and other Learning Resources purchased in previous financial year (Weightage: 10; Quantitative)
Operational Explanation (Page 162): Data on amount spent on new books & journal subscriptions. Average value per sanctioned intake calculated.
Normalization: Formula given at the beginning of Criterion-8 (Page 160).
Scoring Rubrics (Page 162):
Level-1 to Level-4: Based on normalized score ranges.
Supporting Documents: List of Books & Journals procured/subscribed, Invoices, Purchase Order, Tax Invoices and Receipts etc.
4.8. Criterion 3: Teaching –Learning Environment (Selected Safety Parameters)
Focus: Ensuring a safe and compliant campus.
Example Parameter 3.8: Provisions of Biomedical Waste Management in Medical College (Weightage: 10; Qualitative)
Operational Explanation (Page 75): Compliance with the Bio-medical Waste (Management & Handling) Rules, 2019, and a robust institutional policy for segregation and disposal.
Scoring Rubrics (Pages 75-76):
Level-1: Functional BMW Management Committee constituted.
Level-1 plus: Medical College has established physical facilities (segregation, transportation, disposal).
Level-2 plus: College maintains annual reports for BMW generation and Incineration Ash/ETP sludge details.
Level-3 plus: >80% health care workers trained in BMW; 100% directly engaged workers immunized (Hep B/Tetanus).
Supporting Documents: Constitution of BMW Committee, MOMs, Physical facilities, Annual reports, Training/immunization records.
Example Parameter 3.11: Provisions for Fire Safety in Campus (Teaching Block, Hospital Block & Hostel Block) (Weightage: 10; Qualitative)
Operational Explanation (Page 78): Compliance with UG Program regulations & NBC-2016 for fire safety measures.
Scoring Rubrics (Pages 79-81):
Sub-parameter 3.11.1 (Fire NOC): Levels based on availability of Fire NOC for teaching blocks, hostel blocks, attached hospital.
Sub-parameters 3.11.2 to 3.11.4 (Firefighting equipment & preparedness for Medical College-Teaching Blocks, Hostel Blocks, Attached Teaching Hospital respectively): Each has sub-levels for:
Fire Extinguishers: Functional and prominently placed.
Level-1 plus (Fire Alarm System): Functional system.
Level-2 plus (Evacuation & Exit Plan): Prominently visible.
Level-3 plus (Mock Drills & QAS): Regular mock drills; established QAS for fire safety.
Supporting Documents: Fire NOCs, AMCs for Fire Extinguishers, Records of Mock Drills, QAS documents.
Weightage Calculation for Parameter 3.11 (Page 81): Formula aggregates scores from these 4 sub-parameters.
Module 4 Summary and Transition:
This module has provided a deep dive into specific examples from the Draft Framework, illustrating how parameters under various criteria are defined, assessed, and scored. We have seen the interplay of operational explanations, detailed multi-level rubrics, the requirement for specific supporting documents, and the use of normalization formulas for quantitative data. The emphasis on physical verification, faculty and student interactions, and data authentication highlights the comprehensive nature of the proposed assessment.
These examples should give stakeholders a clearer picture of the expectations and the level of detail involved in the accreditation and ranking process. The next module will focus on the crucial aspect of stakeholder engagement – how feedback is being solicited and its importance in refining this comprehensive framework.
Module 5: Call for Stakeholder Engagement – Shaping the Future of Medical Education Assessment
This module focuses on a critical phase in the development of the "Draft Framework for Accreditation and Ranking of Colleges Regulated by National Medical Commission (NMC)": the solicitation of feedback and suggestions from all relevant stakeholders. The National Medical Commission (NMC) recognizes that a framework of this magnitude and impact requires collaborative input to ensure its effectiveness, practicality, and widespread acceptance.
5.1. The Rationale for Stakeholder Engagement - Why is Feedback Crucial?
The NMC's decision to open the draft framework for public and stakeholder consultation is a testament to a transparent and participatory approach to policy development. The rationale behind this engagement includes:
Enhancing Relevance and Practicality:
Stakeholders, particularly those directly involved in medical education (Deans, Principals, faculty, students), possess invaluable on-the-ground experience. Their feedback can help identify if the proposed criteria and parameters are realistic, achievable, and truly reflective of quality medical education.
They can point out potential operational challenges in implementing certain assessment methods or providing the required documentation.
Incorporating Diverse Perspectives:
Medical education impacts a wide array of groups – educators, learners, healthcare providers, patients, and regulatory bodies. Each group may have unique insights and concerns.
Engaging diverse stakeholders ensures that the framework considers different viewpoints, leading to a more balanced and holistic assessment tool.
Identifying Ambiguities and Gaps:
A draft document, despite careful preparation, may contain ambiguities, inconsistencies, or areas that lack clarity. Stakeholders can help pinpoint these issues.
They might also identify potential gaps – aspects of quality medical education that the current draft might not adequately address.
Improving Clarity and Language:
Feedback can help refine the language of the framework, ensuring that definitions, criteria, and rubrics are clearly understood by all parties involved in the assessment process.
Fostering Ownership and Acceptance:
When stakeholders are involved in the development process, they are more likely to understand, accept, and support the final framework. This buy-in is crucial for successful implementation and for achieving the desired improvements in medical education quality.
Ensuring Fairness and Equity:
Stakeholders can provide input on whether the framework is fair and equitable to different types of medical colleges (e.g., new vs. established, government vs. private, those in resource-limited settings).
Aligning with Best Practices:
Public consultation is a recognized best practice in the development of significant regulatory frameworks and quality assurance standards globally.
5.2. Who are the Intended Stakeholders? - A Broad Spectrum of Contributors
The Public Notice explicitly addresses the call for comments to:
"Dean/Principals of all Medical Colleges (under the purview of NMC)": This group is at the forefront of implementing educational policies and will be directly assessed by this framework. Their practical insights into college operations, curriculum delivery, and resource management are invaluable.
"All Stakeholders": This is a broad and inclusive term, clearly indicating that the NMC is seeking input from anyone with a vested interest in the quality of medical education in India. This can include, but is not limited to:
Faculty Members: Teachers, professors, and clinical instructors who are responsible for educating medical students.
Medical Students and Interns: The primary beneficiaries and subjects of the educational process; their perspective on learning experiences, facilities, and support systems is vital.
Medical Professional Bodies and Associations: Organizations representing doctors and specialists (e.g., Indian Medical Association, specialty associations) who have an interest in the competency of future medical graduates.
Researchers and Academicians in Medical Education: Experts who can provide theoretical and evidence-based insights into assessment methodologies and quality benchmarks.
Hospital Administrators and Healthcare Providers: Those who employ medical graduates and can provide feedback on the preparedness of these graduates for the healthcare system.
Alumni of Medical Colleges: Individuals who have experienced the education system and can offer retrospective insights.
Regulatory Bodies and Government Agencies: Other entities involved in healthcare and education governance.
Parents and Guardians of Medical Students: Those who invest in medical education and have an interest in its quality.
The General Public and Patient Advocacy Groups: Ultimately, the quality of medical education impacts the quality of healthcare received by the public.
5.3. The Submission Process - How to Provide Feedback
The NMC has outlined a clear and accessible process for submitting comments and suggestions:
Method of Submission:
Feedback is to be submitted "through an online form." This digital approach facilitates:
Ease of access for stakeholders across the country.
Standardized collection of feedback, making it easier to collate and analyze.
Efficient management of a potentially large volume of responses.
The Online Form Link:
The specific link provided in the Public Notice is: https://forms.gle/nuk5fJgnxyKQZudk9
This is a Google Forms link, a widely used platform for creating and distributing online surveys and forms.
Stakeholders would need to access this link to input their comments. The form likely has structured fields to guide the feedback, possibly aligned with the criteria or sections of the draft framework.
Nature of Feedback Solicited:
The notice requests "comments/suggestions." This implies that stakeholders can:
Comment on existing provisions: point out strengths, weaknesses, potential problems, or areas lacking clarity.
Suggest improvements: Propose alternative approaches, additions, deletions, or modifications to the draft.
5.4. The Deadline for Submission - A Time-Bound Process
Timeframe: Stakeholders are requested to submit their comments/suggestions "within 21 days from the date of publication of this notice."
Date of Public Notice Publication: The notice is clearly dated "10-05-2025."
Calculating the Deadline:
Counting 21 days from May 10, 2025:
May 10 (Day 0/Publication Day)
May 11 (Day 1) ... May 30 (Day 20) ... May 31, 2025 (Day 21)
Therefore, the deadline for submitting feedback is May 31, 2025. Stakeholders must adhere to this timeline for their input to be considered.
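The 21-day count above can be checked with simple date arithmetic; a minimal sketch using Python's standard library:

```python
from datetime import date, timedelta

# Publication date as stated in the Public Notice (10-05-2025, DD-MM-YYYY).
published = date(2025, 5, 10)

# Day 21 counted from the publication date (May 11 = Day 1).
deadline = published + timedelta(days=21)
print(deadline)  # 2025-05-31
```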
Importance of the Deadline: Adhering to the deadline is crucial as the NMC will need time to process the feedback, deliberate on the suggestions, and move towards finalizing the framework.
5.5. Contact Point and Official Endorsement
Signatory of the Notice: The Public Notice is signed by Raghav Langer, SECRETARY, National Medical Commission.
Significance:
This signature lends official authority to the call for comments and underscores the NMC's commitment to this consultative process.
The inclusion of his email address (r.langer@ias.nic.in) serves as a potential point of contact for high-level clarifications, although the primary mode for feedback is the online form. It also reinforces the authenticity of the communication.
5.6. Expected Outcome of the Stakeholder Engagement
Once the submission period closes, the NMC (likely the MARB and the Coordination Division) will undertake the significant task of:
Collating and Reviewing Feedback: Systematically gathering all submitted comments and suggestions.
Analyzing Input: Evaluating the nature, validity, and feasibility of the feedback received.
Deliberation and Revision: Discussing the substantive suggestions and deciding on potential revisions to the draft framework. This may involve further internal consultations or expert reviews.
Finalization of the Framework: Incorporating appropriate changes to produce a final version of the "Framework for Accreditation and Ranking of Colleges Regulated by National Medical Commission."
Notification and Implementation: Once finalized, the framework will likely be officially notified and subsequently implemented, with the independent third-party agency commencing the assessment process based on its provisions.
Module 5 Summary:
Module 5 has highlighted the critical role of stakeholder engagement in refining the NMC's Draft Framework for medical college accreditation and ranking. The NMC is actively seeking diverse feedback through a clearly defined online process with a specific deadline (May 31, 2025). This participatory approach aims to ensure the final framework is robust, practical, fair, and widely accepted, ultimately contributing to the enhancement of medical education standards across India. The involvement of Deans/Principals and "All Stakeholders" underscores the inclusive nature of this consultation. The official endorsement by the NMC Secretary further legitimizes this crucial step in the framework's development.
Criterion 1: Curriculum Implementation and Capacity Building Activities (Total Allocated Weightage: 100 points)
This criterion is fundamental to the NMC's vision for modern medical education in India. It assesses the extent to which medical colleges are effectively translating the prescribed curriculum, particularly the Competency-Based Curriculum (CBC) for the UG Program, into practice. It also evaluates the efforts made by institutions to build the capacity of their faculty and students to engage with this curriculum.
Overall Operational Explanation (as per Page 5 and Page 20):
Core Focus: To capture information related to the implementation of the Competency-Based Curriculum (CBC) prescribed by Medical Regulators for the UG Program. This is a shift from traditional, content-heavy curricula to one focused on graduates achieving specific, observable competencies.
Assessment of Process: Qualitative parameters within this criterion predominantly relate to the process aspects of curriculum implementation. This means assessors will be looking at how the curriculum is delivered, integrated, and supported.
Faculty and Student Interaction: Verification will involve direct interaction with:
Sampled Faculty Staff: Representation from all subjects (pre-clinical, para-clinical, clinical) will be interacted with to understand their role and experiences in curriculum implementation.
Sampled Students: Students from all professional years will be interacted with to capture their perspectives on the curriculum and its delivery.
Data Capture Format (DCF): Medical colleges will be required to provide information related to each parameter using a designed Data Capture Format.
Alignment with NMC Regulations: The parameters are designed to assess compliance with standards laid down by the Undergraduate Medical Education Board (UGMEB) for Competency-Based Medical Education, provisions for Faculty Development Programs (FDPs), and aspects related to academic excellence as mentioned in the Assessment and Rating Regulations-2023.
Detailed Breakdown of Parameters under Criterion 1:
Parameter 1.1 – Operational Explanation (Page 20):
Focuses on verifying how the college implements the CBC.
Key aspects include:
Aligning theory, practical, and clinical experiences with the prescribed competencies.
Facilitating horizontal & vertical integration among competencies prescribed for pre-clinical, para-clinical & clinical subjects.
Methods for developing competencies, including:
Specification of Learning Domains (Cognitive, Affective, Psychomotor) for desired behavioral changes.
Specification of Proficiency or Competency levels (K-Knows, KH-Knows How, S-Shows, SH-Shows How, P-Performs independently) for practical & clinical skills.
Suggested teaching/training methods for Faculty.
Suggested assessment methods and tools for evaluating and monitoring mastery of competencies.
Sources: GMER-2023, Competency-Based UG Curriculum for Indian Medical Students Vol. I, II & III-2018.
Verification (Page 21):
Interaction with 25% of faculties from each pre-clinical, para-clinical & clinical subject/department (randomly selected, not members of Curriculum Committee).
Interaction with 5% of total enrolled students from each professional year (First, Second, Third Part-1, Third Part-2), with a total sample size between 30 and 60 students (Central Limit Theorem consideration).
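The student-sampling rule above (5% from each professional year, with the total held between 30 and 60) can be sketched as follows; the proportional rescaling used to enforce the 30-60 band is an assumption, since the framework does not spell out how the clamp is applied:

```python
import math

def student_sample_sizes(enrolment_by_year, fraction=0.05, total_min=30, total_max=60):
    """Illustrative sketch of the 5% sampling rule (function and field names are
    assumptions). Draws ~5% from each professional year, then rescales so the
    total sample stays within the 30-60 band mentioned in the framework."""
    raw = {year: max(1, math.ceil(n * fraction)) for year, n in enrolment_by_year.items()}
    total = sum(raw.values())
    if total < total_min or total > total_max:
        # Assumed behaviour: scale each year's quota proportionally toward the band.
        target = min(max(total, total_min), total_max)
        scale = target / total
        raw = {year: max(1, round(k * scale)) for year, k in raw.items()}
    return raw

# Hypothetical college with a sanctioned intake of 150 per year.
sizes = student_sample_sizes({"First": 150, "Second": 150, "Third Part-1": 150, "Third Part-2": 150})
```

With 150 students per year, 5% rounds up to 8 per year, giving a total of 32, which already falls inside the 30-60 band.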
Scoring Rubrics (Pages 22-23):
Sub-parameter 1.1.1: Alignment with Competences:
Assesses if sampled faculties can show documented evidence (Curriculum Plan, Lesson Plans, Attendance Registers) of aligning theory and practical/clinical sessions with prescribed competencies.
Levels 1-4: Based on <50%, 50-70%, 71-90%, >90% of faculties demonstrating this.
Sub-parameter 1.1.2: Specifications of Learning Objectives:
Assesses if sampled faculties can show how sessions are conducted in sync with competency-wise Learning Objectives aligned with Learning Domains.
Levels 1-4: Based on similar faculty percentage brackets.
Sub-parameter 1.1.3: Specification of Competency or Proficiency Levels:
Assesses if sampled faculties can show how sessions are organized by specifying Competency/proficiency levels based on Miller’s Pyramid (K, KH, S, SH & P).
Levels 1-4: Based on similar faculty percentage brackets.
Sub-parameter 1.1.4: Integrated Teaching-Learning:
Assesses if sampled faculties can show documented evidence of planning and conducting Integrated teaching-learning sessions (Horizontal & Vertical Integration).
Levels 1-4: Based on similar faculty percentage brackets.
Sub-parameter 1.1.5: Interaction with Sampled Students:
Assesses if sampled students can:
Level-1 (<30%): Tell the type of competencies taught.
Level-2 plus (30-50%): Also tell about recently organized integrated teaching-learning sessions.
Level-3 plus (30-50%): Also tell about proficiency/competency levels (K, KH, S, SH & P) they were required to acquire.
Supporting Documents: Log Books of students.
Weightage Calculation (Page 24): The scores from these 5 sub-parameters (normalized by dividing by their maximum score, typically 4) are summed and then multiplied by the Assigned Weightage (AW) of 30 for Parameter 1.1.
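The rubric brackets and the weightage calculation for Parameter 1.1 can be sketched in code. The percentage-to-level mapping follows the brackets given for sub-parameters 1.1.1-1.1.4; averaging the normalized levels before scaling by the Assigned Weightage is one plausible reading of the Page-24 formula (an assumption, so that the result never exceeds the AW):

```python
def faculty_level(pct):
    """Map the % of sampled faculty demonstrating a practice to a rubric level,
    per the brackets for sub-parameters 1.1.1-1.1.4:
    <50% -> Level 1, 50-70% -> Level 2, 71-90% -> Level 3, >90% -> Level 4."""
    if pct < 50:
        return 1
    if pct <= 70:
        return 2
    if pct <= 90:
        return 3
    return 4

def parameter_score(levels, assigned_weightage, max_level=4):
    """Assumed aggregation: each sub-parameter level is normalized by its
    maximum (typically 4), the normalized scores are averaged, and the
    result is scaled by the Assigned Weightage (AW)."""
    normalized = [lv / max_level for lv in levels]
    return sum(normalized) / len(normalized) * assigned_weightage

# Hypothetical faculty percentages for sub-parameters 1.1.1-1.1.4,
# plus an assumed Level-3 outcome for the student interaction (1.1.5).
levels = [faculty_level(p) for p in (45, 65, 85, 95)]  # -> [1, 2, 3, 4]
score = parameter_score(levels + [3], assigned_weightage=30)
```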
Parameter 1.2 – Operational Explanation (Page 24):
Deals with elective courses offered to provide additional learning experiences.
Refers to the erstwhile MCI framework (Modules on Elective Courses) and NMC suggestions (e.g., AI, Indian system of medicine, medical photography, art/music in Medicine, medical subjects).
Electives (1 month) are in 2 blocks of 15 days each in the Final Professional year (1st block after annual exam of III MBBS part I, 2nd block after the end of 1st elective).
Verification (Page 24-25): Interaction with sampled faculties and students of Third Professional Part-1 & 2.
Scoring Rubrics (Pages 25-26):
Sub-parameter 1.2.1: Planning of Electives:
Assesses documentation of Learning Objectives (LOs), List of Activities, Preceptors/Supervisors, Prerequisites, Assessment Methods, and required entries in Logbooks/Portfolio.
Level-1 to Level-4: Based on the percentage of electives opted by students in the previous/ongoing academic calendar for which the college produces documented evidence for these planning aspects (<50%, 50-70%, 71-90%, >90%).
Supporting Documents: Teaching Plans, Curriculum Plan, Logbooks, Elective-wise decided LOs, List of Activities.
Sub-parameter 1.2.2: Conduct of electives vis-à-vis sampled students:
Assesses student confirmation of receiving choice-based options and notification of elective details (preceptors, activities, prerequisites) before selection.
Level-1 (<50% students confirm): Provided with choice-based options.
Level-2 plus (50-70% students confirm): Also confirm notification of details.
Level-3 plus (50-70% students can explain): Also able to explain LOs and learning activities for their electives.
Supporting Documents: Documented Learning Evidences (Logbooks & Portfolios).
Sub-parameter 1.2.3: Documentation of Learning Evidences:
Assesses faculty's ability to explain and produce documented evidence of how student activities in electives were verified and rated for satisfactory completion.
Level-1 to Level-4: Based on the percentage of sampled faculties demonstrating this (<50%, 50-70%, 71-90%, >90%).
Weightage Calculation (Page 26): Aggregates scores from these 3 sub-parameters, multiplied by AW of 10.
Parameter 1.3 – Operational Explanation (Page 26):
Requires College/Institution to align academic functioning with prescribed Curriculum.
Focuses on the roles:
College Council: Meet at least 4 times/year for curriculum details, training programme, discipline, etc.
Curriculum Committee: Ensure implementation and monitoring of Curriculum.
MEU: Conduct FDPs as per faculty training needs.
Verification (Page 27): Interaction with College Council Members, Curriculum Committee members, MEU members, and sampled students (as per Parameter-1).
Scoring Rubrics (Pages 27-29):
Sub-parameter 1.3.1: College Council:
Level-1: Not constituted as per regulations.
Level-2: Constituted as per regulations.
Level-2 plus: Minimum meetings organized as per regulations.
Level-3 plus: In meetings, subject-wise implementation of curricular activities and student performance reviewed; Action Taken Reports (ATRs) on previous meeting recommendations presented/discussed.
Supporting Documents: Constitution of College Council, MOMs & Agenda, ATRs.
Sub-parameter 1.3.2: Curriculum Committee: Similar levels for constitution, meetings, and preparation of Time Table/Teaching Schedule/Clinical Posting plans.
Sub-parameter 1.3.3: Monitoring of Curriculum Implementation by Curriculum Committee:
Level-1: Not monitoring.
Level-2: Specified professional year-wise lecture/tutorial/seminar/DOAP/Bedside Clinic numbers.
Level-2 plus: Also specified average per student OSCE/OSPE, MINI CEX & DOPS based skill assessments.
Level-3 plus: Supervising curriculum implementation by faculties subject-wise for each professional year based on these indicators.
Supporting Documents: MOMs, ATRs, Review of Curriculum Implementation (Planned vs. Actuals), Periodical Reports.
Sub-parameter 1.3.4: Medical Education Unit:
Level-1: Composition not aligned with provisions.
Level-2: Composition aligned.
Level-2 plus: MEU (with Curriculum Committee) has identified faculty training needs for CBC implementation.
Level-3 plus: MEU has organized BCME & CISP with proper documentation.
Supporting Documents: Document for MEU composition, Training needs identification, FDP organization evidences.
Sub-parameter 1.3.5: Curricular Activities vis-à-vis Sampled Student:
Assesses student confirmation of academic calendar activities (Lectures, Seminars, Tutorials, DOAP, Bed Side Clinics) being held in sync with college-provided numbers, and conduct of OSCE/OSPE/MINI CEX/DOPS.
Level-1 (<50% students confirm): Activities held without major deviations.
Level-2 (50-70% students confirm): Similar to Level-1.
Level-2 plus (50-70% students confirm OSCE/OSPE): Also confirm OSCE/OSPE assessments for practical/clinical skills.
Level-3 plus (50-70% students confirm MINI CEX/DOPS): Also confirm MINI CEX/DOPS assessments for clinical skills in real healthcare settings.
Supporting Documents: Documented evidences, Subject Attendance Records, Logbooks.
Weightage Calculation (Page 29): Aggregates scores from these 5 sub-parameters, multiplied by AW of 20.
Parameter 1.4: (Detailed under Module 4, Section 4.1 earlier, as it is a good example of a quantitative parameter with multiple sub-components and specific documentation requirements such as certificates.)
Parameter 1.5 – Operational Explanation (Page 32):
Deals with MOUs executed with partnering institutions (India and abroad) for academic and research collaborations.
Aims to equip students/faculty with best practices, teaching/training methods, and offer opportunities for visits, short-duration research, strategic partnerships, workshops, conferences.
Scoring (Pages 32-34):
Differential weightage scores are given per MOU based on the category of the Academic/Research Institution collaborated with:
Category-1 (25 score/MOU): Not in NIRF/abroad ranking, OR not accredited/rated by government bodies.
Category-2 (150 score/MOU): Accredited/Rated by government recognized bodies for Higher Education/Health Education.
Category-3 (200 score/MOU): Participant in government recognized ranking (NIRF) with positions under top 50.
Category-4 (100 score/MOU): NIRF participant with positions beyond top 50.
Category-5 (200 score/MOU): Participant in world ranking (QS, THE, ARWU) within top 500.
Category-6 (100 score/MOU): Participant in world ranking beyond top 500.
Category-7 (150 score/MOU): Central/State Govt. body (specialized research/technical/funding, not academic programs).
Notes:
Collaboration for: (a) Research-based strategic partnership, (b) Organization of academic workshops/conferences/seminars.
Maximum 2 MOUs/Collaborations per category considered.
MOUs signed >2 years ago with no tangible action get no score.
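The per-MOU category scores and the notes above (a cap of 2 MOUs per category; no score for MOUs signed more than 2 years ago with no tangible action) can be sketched as a small scoring routine; the tuple layout and field names are illustrative assumptions, not the framework's data format:

```python
# Score per MOU by category of collaborating institution (Pages 32-34).
MOU_SCORE = {1: 25, 2: 150, 3: 200, 4: 100, 5: 200, 6: 100, 7: 150}

def mou_raw_score(mous, per_category_cap=2):
    """Sketch of the Parameter 1.5 scoring notes. Each MOU is modelled as a
    (category, years_old, has_tangible_action) tuple (an assumed layout)."""
    counted = {}
    total = 0
    for category, years_old, has_action in mous:
        if years_old > 2 and not has_action:
            continue  # signed >2 years ago with no tangible action: no score
        if counted.get(category, 0) >= per_category_cap:
            continue  # at most 2 MOUs considered per category
        counted[category] = counted.get(category, 0) + 1
        total += MOU_SCORE[category]
    return total

# Hypothetical portfolio: three Category-3 MOUs (third is capped) and one
# stale Category-1 MOU with no tangible action (scores nothing).
total = mou_raw_score([(3, 1, True), (3, 1, True), (3, 1, True), (1, 3, False)])
```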
Normalization Formula (Page 33): ((x’) - (x)) / ((y) - (x)) * 100 (where x' is average score per sanctioned intake for the college, y is max, x is min across all colleges).
Scoring Levels (Page 34):
Level-1 to Level-4: Based on normalized score ranges (≤25, >25 to ≤50, >50 to <75, ≥75).
Supporting Documents: Documented evidences of MOUs, accreditation/rating of collaborating institution, participation/ranks in NIRF/world ranking systems.
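The normalization formula (Page 33) and the Level-1 to Level-4 bands (Page 34) for Parameter 1.5 translate directly into code; a minimal sketch, assuming x' is the college's own average score per sanctioned intake and y and x are the maximum and minimum across all assessed colleges:

```python
def normalize(x_prime, x_min, x_max):
    """Min-max normalization from Page 33: ((x') - (x)) / ((y) - (x)) * 100."""
    return (x_prime - x_min) / (x_max - x_min) * 100

def normalized_level(score):
    """Level bands from Page 34: <=25 -> Level 1, >25 to <=50 -> Level 2,
    >50 to <75 -> Level 3, >=75 -> Level 4."""
    if score <= 25:
        return 1
    if score <= 50:
        return 2
    if score < 75:
        return 3
    return 4

# Hypothetical values: college average 60, with 10 and 110 as the
# minimum and maximum across all colleges.
level = normalized_level(normalize(60, 10, 110))  # (60-10)/100*100 = 50 -> Level 2
```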
Parameter 1.6 – Operational Explanation (Page 35):
Deals with the execution/outcomes of MOUs mentioned in Parameter 1.5 (short-duration research projects, strategic partnerships, workshops, conferences).
Scoring Rubrics (Pages 35-37):
For each category of collaborating institution (similar to 1.5), separate scores are assigned for:
Collaborative Research or Academic or Clinical Project: e.g., Category-1 = 50 score, Category-2 = 100, Category-3 = 150.
Organization of Workshops (score per workshop): e.g., Category-1 = 5 per 3-4 hours, Category-2 = 10, Category-3 = 25.
Organization of Conferences or Seminars (score per event): scoring similar to workshops.
Notes:
Research projects can be self-financed by both/collaborating institutions. If funded by an agency, reported under Criterion-7.
Seminars/Conference up to max 20 hours considered (1 full day = max 6 hrs, half day = min 3 hrs).
Workshops up to max 20 hours considered (similar hour definition).
Self-financed collaborative projects are considered; one project counts for the required year, whether ongoing or completed during that year.
Normalization & Scoring Levels (Page 37): Uses the same normalization formula and Level-1 to Level-4 score ranges as Parameter 1.5.
Supporting Documents: Video recordings of workshops, geo-tagged photos of seminars, records of resource persons, attendance records, documented evidences of research projects (proposals, published papers). Lists of faculty/students visited from both sides, communication of visits.
Criterion 1 places significant emphasis on the systematic and effective implementation of the NMC's prescribed Competency-Based Curriculum. It scrutinizes the planning and delivery of educational activities, the functioning of key academic bodies, the continuous development of faculty, and the tangible outcomes of academic collaborations. The assessment involves a mix of qualitative process evaluation (through faculty/student interactions and document reviews) and quantitative measures (like FDP completion rates, number of MOUs, and their outcomes). The detailed rubrics and verification requirements indicate a thorough and evidence-based approach to ensure that medical colleges are not just nominally adopting the curriculum but are truly embedding it into their educational ethos and practices. The weightage of 100 points signifies its importance in the overall assessment framework.
Criterion 2: Clinical Exposure, Clinical Training, Internship and Clinical Facilities (Total Allocated Weightage: 100 points)
This criterion is of paramount importance as it directly evaluates the quality and quantum of hands-on clinical experience provided to undergraduate medical students. The ability of a medical college to offer diverse, adequate, and well-supervised clinical exposure is fundamental to producing competent and skilled medical practitioners. It assesses the resources, patient load, and training opportunities available within the attached teaching hospital(s) and affiliated community health settings.
Overall Operational Explanation (as per Page 6 and Page 37-38):
Core Focus: Based on the curriculum laid down by the Medical Regulator for UG Program, students must be provided with mandatory hands-on experiences through clinical postings across a wide range of specialties.
Assessment of Clinical Material: Parameters are designed to assess the availability of required clinical materials (patients, investigative facilities, operative opportunities) in different specialties.
Assumption of Adequacy: The framework operates on the assumption that adequate patient loads and other clinical materials are essential for effective clinical training of medical students in real healthcare settings.
Alignment with Regulations:
Parameters are related to standards laid down by UGMEB for clinical training.
They also align with the "satisfactory teaching-learning environment" category of parameters as per the Assessment and Rating Regulations-2023.
Specifics of clinical posting hours (from second professional onwards) and internship requirements (1 year) as per regulations are considered.
Clinical exposure is in the form of "learner-doctor method of clinical training (Clerkship based Clinical Experiences)" in all phases, including early clinical exposures in the first professional year.
Regulation-2023 regarding Rural Health Training Centres (RHCs)/Community Health Centres (CHCs)/Urban Health Centres (UHCs) for internship training is also factored in (15 interns per centre as per CRMI regulations 2021).
Student Interaction: For several parameters, sampled students will be interacted with, aligning this criterion also with students’ feedback on various affairs of the Medical College.
Detailed Breakdown of Parameters under Criterion 2:
Parameter 2.1: Provision of Clinical Exposure/posting/internship to students/Interns vis-à-vis varied clinical specialties/Health Care setting (Weightage: 10; Qualitative)
Operational Explanation (Page 38):
Students must be mandatorily provided clinical experiences in diverse settings: UHCs, RHCs, and Hospital (Secondary & Tertiary level).
Specialties include: Medicine & allied (Gen. Med, Paediatrics, Resp. Med, Dermatology, Psychiatry), Surgery & allied (Gen. Surgery, Ortho-Trauma/PM&R, Ophthalmology, ENT, Dentistry, Anaesthesia), Obstetrics & Gynaecology, and Emergency Medicine.
Verification Process (Page 39):
Desktop assessment of information provided.
Interaction with the same set of students sampled under Criterion-1 or a different sampled set (Second Prof, Third Prof Part-1 & 2, Interns).
Interaction with the same set of clinical faculties sampled under Criterion-1.
Scoring Rubrics (Pages 39-40):
Sub-parameter 2.1.1: Specialty wise clinical posting duration vis-à-vis prescribed hours or weeks:
Level-1: Attached Teaching hospital is equipped with required Clinical specialties for student postings.
Level-1 plus: On average, 2nd Professional students are deputed for minimum prescribed hours/weeks in required specialties.
Level-2 plus: On average, 3rd Professional (Part-1 & 2) students are deputed for minimum prescribed hours/weeks in required specialties.
Level-3 plus: On average, interns are provided opportunity for required prescribed hours/weeks for rotatory internships in above-mentioned specialties (excluding Community Medicine).
Supporting Documents: Clinical Posting Rotation Schedules for all Professional Years, Rotatory Internship schedule, Logbooks of students and interns.
Sub-parameter 2.1.2: Rotatory Internship in Community Medicine:
Level-1: Medical College has arrangements for required number of Rural & Urban Health Centres for posting max 15 interns at a time.
Level-1 plus: All interns are deputed for internship in Urban Health Centres for required hours.
Level-2 plus: All interns are deputed for internship in Rural Health Centres for required hours.
Level-3 plus: Internship duration for Urban and Rural Health Centres is minimum 2 months.
Supporting Documents: Logbooks maintained by Interns, Internship Schedule for RHC/UHC.
Sub-parameter 2.1.3: Interaction with Sampled students by Assessment team:
Assesses student confirmation of their clinical postings in various specialties.
Level-1 to Level-4: Based on percentage of students confirming (<25%, 25-50%, 50-75%, >75%).
Supporting Documents: Logbooks of students, Case Records & History Taking records.
Sub-parameter 2.1.4: Interaction with Sampled Interns by Assessment team:
Similar to 2.1.3, but for interns confirming their clinical postings.
Weightage Calculation (Page 41): Formula aggregates scores from these 4 sub-parameters, multiplied by AW of 10.
Parameter 2.2: Specialty/Clinical Department wise OPD Attendance in the calendar year (Weightage: 15; Quantitative)
(Detailed under Module 4, Section 4.2 earlier, as an example of quantitative data normalization and sub-parameter application for OPD attendance across different specialty groups.)
Parameter 2.3: Specialty wise % of Bed Occupancy in Hospital in the last calendar year (Weightage: 15; Quantitative)
Operational Explanation (Page 44):
Quality of training is dependent on IPD patient loads. MSR-2023 mandates average indoor bed occupancy of minimum 80% per annum.
Daily count of patients remaining in beds at midnight is used for calculation.
Verification:
IPD admissions and Bed days data provided by College physically verified based on random sampling.
Sampled data auto-generated for sampled months/days for physical verification.
If data manipulation is found, IPD admissions and Bed days data shall be reduced.
Scoring Rubrics (Pages 44-46):
Normalization Formula (Page 44): ((x’) - (x)) / ((y) - (x)) * 100 where x’ is college's average per unit IPD Bed days data, y is max, x is min. Required teaching beds for UG or UG+PG used as denominator.
Level-1 to Level-4 (Page 45): Based on normalized score ranges (≤25, >25 to ≤50, >50 to <75, ≥75).
Sub-parameters (2.3.1 to 2.3.4): This scoring is applied separately for:
Percentage Bed occupancy across all clinical departments/specialties annually.
Percentage Bed occupancy across Medicine and allied specialties.
Percentage Bed occupancy across Surgery and allied specialties.
Percentage Bed Occupancy in Obstetrics and Gynaecology.
Supporting Documents: IPD Register, Cash Receipts, Physical Verification etc.
Weightage Calculation (Page 46): Formula aggregates scores from these 4 sub-parameters.
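The min-max normalization and four-level rubric above recur across the framework's quantitative parameters. A minimal Python sketch (the function names are illustrative, not taken from the framework):

```python
def normalize(x_prime, x_min, x_max):
    """Min-max normalize a college's per-unit value onto the 0-100 scale.

    x_prime: the college's own average per-unit value (e.g., IPD bed days
    per required teaching bed); x_min/x_max: the minimum and maximum such
    averages observed across all colleges.
    """
    if x_max == x_min:  # degenerate case: every college reports the same value
        return 100.0
    return (x_prime - x_min) / (x_max - x_min) * 100


def level(score):
    """Map a normalized score onto Level-1 to Level-4 using the ranges on
    Page 45: <=25, >25 to <=50, >50 to <75, >=75."""
    if score <= 25:
        return 1
    if score <= 50:
        return 2
    if score < 75:
        return 3
    return 4
```

For example, a college whose average sits midway between the minimum and maximum (normalize(80, 40, 120) → 50.0) lands in Level-2.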
Parameter 2.4: Specialty wise Number of Minor surgeries performed in OT in past 1 Year (Weightage: 10; Quantitative)
(Detailed under Module 4, Section 4.2 earlier, as an example of scoring for operative procedures.)
Parameter 2.5: Specialty wise Number of Major surgeries performed in OT in past 1 Year (Weightage: 10; Quantitative)
Operational Explanation (Page 48): Similar to minor surgeries, emphasizes the need for varied clinical materials including major surgeries in OTs. All surgeries performed under General/Regional Anaesthesia are treated as Major surgeries.
Scoring Rubrics (Page 48):
Uses the same normalization formula and Level-1 to Level-4 score ranges as Parameter 2.4 (Minor Surgeries), but data is specific to major surgeries.
Supporting Documents: Data of Major operative works performed, Physical verification.
(Note: Parameter-4 refers to Minor Surgeries, Parameter-5 to Major Surgeries. The rubrics for 2.4 and 2.5 are essentially identical in structure, differing only in the type of surgery data used.)
Parameter 2.6: On Average Radiological Investigations performed in OPD & IPD together in past 1 Year (Weightage: 05; Quantitative)
Operational Explanation (Page 49): Based on curriculum, a well-equipped radio-diagnosis department is essential. Quality of clinical training in Radio-diagnosis depends on varied radiological investigations performed.
Scoring Rubrics (Page 49):
Normalization Formula: the standard formula (x′ − x) / (y − x) × 100, where x′ is the college's average number of radiological investigations performed per intake, y the maximum and x the minimum across colleges. UG or UG+PG intake is considered.
Level-1 to Level-4 (Page 50): Based on normalized score ranges (≤25, >25 to ≤50, >50 to <75, ≥75).
Supporting Documents: Data of radiological investigations performed in Department of Radio-diagnosis.
Parameter 2.7: On Average Laboratory based Investigations performed in OPD & IPD together in past 1 Year (Weightage: 10; Quantitative)
Operational Explanation (Page 50): Well-equipped central/clinical laboratories (histopathology, cytopathology, haematology, immune pathology, microbiology, biochemistry) are crucial. Lab-based investigations are significant for clinical training.
MSR-2023 benchmarks:
Histopathology: samples ≥ 20% of total major surgeries.
Cytopathology: samples ≥ 1% of total hospital OPD.
Haematology, Clinical Pathology, Clinical Biochemistry: samples ≥ 15% of OPD and 30% of indoor beds.
Microbiology: samples ≥ 30% of indoor beds and 50% of total surgery cases.
Scoring Rubrics (Pages 50-52):
Normalization Formula (Page 50): Standard formula, applied to college-wise per unit average value of investigations.
Sub-parameters (2.7.1 to 2.7.5, Page 51-52): Scores (Level 1-4 based on normalized ranges) are calculated separately for the percentage of lab investigations performed in:
2.7.1: Histopathology Lab (denominator: total major surgeries).
2.7.2: Cytopathology Lab (denominator: total OPD attendance).
2.7.3: Haematology, Clinical Pathology, and Clinical Biochemistry Labs (denominator: total OPD attendance and IPD admissions).
2.7.4: Microbiology Labs (denominator: total IPD admissions).
2.7.5: Microbiology Labs (denominator: total surgery cases).
Supporting Documents: Cash Receipts, Lab entries, Investigation reports generated etc.
Weightage Calculation (Page 52): Formula aggregates scores from these 5 sub-parameters.
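The MSR-2023 laboratory benchmarks quoted above reduce to simple arithmetic checks. A hedged sketch: the function name and sample keys are invented, and the draft's phrasing "15% of OPD and 30% of indoor beds" is read here as a summed total, though it could also mean two separate floors:

```python
def msr_lab_benchmarks(major_surgeries, total_surgeries, opd_attendance,
                       indoor_beds, samples):
    """Return benchmark-name -> met? for the MSR-2023 lab workload minimums.

    `samples` maps a lab group to the number of samples it processed
    in the year (keys are illustrative).
    """
    return {
        # Histopathology: samples >= 20% of total major surgeries
        "histopathology": samples["histopathology"] >= 0.20 * major_surgeries,
        # Cytopathology: samples >= 1% of total hospital OPD
        "cytopathology": samples["cytopathology"] >= 0.01 * opd_attendance,
        # Haematology/Clin Path/Clin Biochem: 15% of OPD + 30% of indoor beds
        "haematology_group": samples["haematology_group"]
            >= 0.15 * opd_attendance + 0.30 * indoor_beds,
        # Microbiology: 30% of indoor beds + 50% of total surgery cases
        "microbiology": samples["microbiology"]
            >= 0.30 * indoor_beds + 0.50 * total_surgeries,
    }
```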
Parameter 2.8: On Average Daily Patient admission/attendance Casualty/Emergency Department in past 1 year (Weightage: 05; Quantitative)
Operational Explanation (Page 53): GMER-2023 mandates clinical training in Emergency Medicine for 2nd and 3rd professional students. Exposure to patient handling, triaging, stabilization is essential. Number of teaching beds as per sanctioned intake (MBBS Admission-2020) and MSR-2023 for emergency medicine beds are considered.
Scoring Rubrics (Pages 53-54):
Normalization Formula (Page 53): the standard formula (x′ − x) / (y − x) × 100, applied separately to OPD attendance and IPD admission data in the Emergency Department.
Sub-parameters (2.8.1 - IPD Admission, 2.8.2 - OPD Attendance):
Level-1 to Level-4: Based on normalized score ranges for IPD admission and OPD attendance vis-à-vis sanctioned intake.
Supporting Documents: OPD attendance and IPD admission records in Emergency Department, Cash Receipts.
Weightage Calculation (Page 54): Formula aggregates scores from these 2 sub-parameters.
Parameter 2.9: Provision of Community Postings at RHC/UHC under Community Medicine in past 1 year (Weightage: 05; Quantitative)
Operational Explanation (Page 55): MSR-2023 mentions RHC/UHC affiliation for internship training (15 interns/centre). Data pertaining to OPD attendance and IPD admission at these centres is captured.
Scoring Rubrics (Pages 55-56):
Normalization Formula (Page 55): Standard formula applied separately for UHC and RHC data, vis-à-vis per sanctioned intake.
Sub-parameters (2.9.1 - IPD UHC, 2.9.2 - OPD UHC, 2.9.3 - IPD RHC, 2.9.4 - OPD RHC):
Level-1 to Level-4: Based on normalized score ranges for IPD/OPD data at UHCs and RHCs vis-à-vis sanctioned intake.
Supporting Documents: OPD attendance and IPD admission records for UHC/RHC, Cash Receipts
Weightage Calculation (Page 57): Formula aggregates scores from these 4 sub-parameters.
Parameter 2.10: No. of patients treated in Intensive Care Areas/High Dependency Units in past 1 year (Weightage: 05; Quantitative)
Operational Explanation (Page 57): Students require experience in ICU, ICCU, RICU, PICU, NICU, Burns unit, post-op surgical ICU, Obstetric HDU/ICU. MSR specifies bed/unit requirements per sanctioned intake.
Scoring Rubrics (Pages 57-58):
Normalization Formula (Page 57): Standard formula for "Clinical Material-admission/stays in Critical Care Units" per intake.
Level-1 to Level-4 (Page 58): Based on normalized score ranges.
Supporting Documents: Patient Admission Records Critical Care Units and Cash Receipts etc.
Parameter 2.11: No. of deliveries (both normal & C-Section) in past 1 year (Weightage: 10; Quantitative)
Operational Explanation (Page 58): Mandatory clinical experiences in Obstetrics & Gynaecology. MSR specifies bed/unit requirements per sanctioned intake.
Scoring Rubrics (Pages 58-59):
Normalization Formula (Page 58): Standard formula applied separately for Normal and C-Section deliveries per intake.
Sub-parameters (2.11.1 - C-section, 2.11.2 - Normal deliveries):
Level-1 to Level-4: Based on normalized score ranges for C-section and normal deliveries vis-à-vis sanctioned intake.
Supporting Documents: IPD admissions in Department of Gynaecology, Records of deliveries performed etc.
Weightage Calculation (Page 60): Formula aggregates scores from these 2 sub-parameters.
Summary of Criterion 2:
Criterion 2 meticulously evaluates the core of practical medical training. It uses a predominantly quantitative approach to assess whether students are exposed to a sufficient volume and variety of clinical cases across different specialties and settings. Parameters cover OPD/IPD loads, bed occupancy, surgical and delivery numbers, investigative procedures, and experiences in emergency and intensive care. The emphasis on data normalization aims to ensure fair comparison. The qualitative aspect focuses on the structured provision of these postings and internship opportunities. The extensive documentation required underscores the need for meticulous record-keeping by medical colleges. This criterion, with its 100-point weightage and 11 detailed parameters, clearly signals the NMC's commitment to ensuring that graduates are clinically competent through robust, real-world training.
Criterion 3: Teaching-Learning Environment: Physical, Psychological & Occupational (Weightage: 130)
This criterion is crucial for evaluating the overall ecosystem a medical college provides to foster effective learning, ensure student and staff well-being, and maintain a safe and compliant operational environment. It moves beyond academic inputs to assess the holistic infrastructure and support systems. With a significant weightage of 130 points and all 11 parameters being qualitative, this criterion emphasizes the adequacy, functionality, and optimal utilization of facilities and systems rather than their mere quantitative presence.
Core Focus: To assess whether the medical college provides a conducive teaching-learning environment by examining three broad aspects:
Physical Environment: Availability and functionality of essential academic infrastructure like libraries, laboratories, and technology.
Psychological Environment: Measures to ensure student well-being, safety from harassment, and access to amenities that support a positive campus life.
Occupational Environment: Safety measures related to hospital operations, waste management, and regulatory compliance concerning potentially hazardous materials or procedures.
Alignment with Regulations: This criterion is explicitly linked to the "Establishment of Medical institutions, Assessment & Rating Regulations, 2023," indicating that the parameters are derived from or aligned with these comprehensive regulatory standards set by the NMC. It also draws from MSR-2023 for specific requirements.
Assumption of Need: The framework assumes that adequate and functional infrastructure (Central Library, Practical Labs, Skill Lab), safety measures (anti-ragging, BMW, fire safety), and student amenities (sports, hostels) are highly needed for creating a conducive learning environment and ensuring practical and clinical skill competency development.
Qualitative Assessment: All 11 parameters under this criterion are designated as "Qualitative." This implies that assessment will focus on:
Adequacy: Whether the facilities are sufficient for the student intake and program requirements.
Functionality: Whether the facilities are in good working order and usable.
Optimum Utilization: Whether the facilities are being effectively used by students and faculty.
Compliance: Adherence to NMC guidelines, MSR, and other relevant safety and regulatory standards.
Student and Faculty Interaction: Verification often involves interaction with sampled students and faculty to gauge their experiences and perceptions of these environmental factors.
Detailed Breakdown of Parameters under Criterion 3:
Parameter 3.1: Central Library facilities and their utilization (Qualitative)
Operational Explanation (Page 61):
References MSR-2023 for Library requirements (number of Titles, Books, Journals linked to sanctioned intake).
Focuses on how available Library-based facilities (physical and digital) are being utilized.
Verification: Interaction with sampled students (1st, 2nd, 3rd Prof Part-1 & 2).
Scoring Rubrics (Pages 61-62):
Sub-parameter 3.1.1 (Available textbooks vis-à-vis sanctioned intake):
Level-1: Books (Print) in sync with MSR (30 books/intake, covering all subjects).
Level-1 plus: Books (Print) <5% more than MSR.
Level-2 plus: Books (Print) 5-10% more than MSR.
Level-3 plus: Books (Print) >10% more than MSR.
Supporting Documents: Accession Records & Registers, Stock Registers.
Sub-parameter 3.1.2 (Available journals vis-à-vis sanctioned intake):
Similar levels for Journals (Print & electronic both) being in sync with MSR (minimum 1 journal per specialty; 1% of prescribed books; 15 journals per 50 students).
Supporting Documents: Annual subscriptions of e-Journals, Invoices, payment records.
Sub-parameter 3.1.3 (Automation & Creation of e-Library):
Level-1: Adopted automation/digitization (Library Management Software/applications).
Level-1 plus: Created e-Library with search & access facilities to e-resources (NML-ERMED, e-Sodh Sindhu, etc.).
Level-2 plus: If 50% of sampled students can show they can access e-resources procured by the college.
Level-3 plus: If college has an evolved mechanism for online tracking of e-resource usage by students/faculty.
Supporting Documents: Procurement records for e-resources, Electronic evidences of e-resource access.
Weightage Calculation (Page 62): Formula aggregates scores from these 3 sub-parameters.
Parameter 3.2: Student Practical Laboratories and their utilization (Qualitative)
Operational Explanation (Page 63):
References MSR-2023, Minimum Requirements for Annual MBBS Admission-2020, Standard Assessment Form for AY 2022-23 regarding 8 student practical labs (Histology, Clinical Physiology, Biochemistry, Histopathology & Cytopathology, Clinical Pathology & Haematology, Microbiology, Clinical Pharmacology, CAL in Pharmacology).
Verification: Interaction with sampled students (1st & 2nd Prof) and faculties (Human Anatomy, Physiology, Biochemistry, Pathology, Microbiology, Pharmacology, Forensic Medicine).
Scoring Rubrics (Pages 63-64):
Sub-parameter 3.2.1 (Availability and usages of Practical Laboratories by Faculty):
Level-1: Medical College has all 8 Practical Laboratories.
Level-1 plus: Min 25-50% sampled faculties produce documented evidence of DOAP Sessions conducted by them in concerned labs in past academic year.
Level-2 plus: Min 25-50% sampled faculties produce evidence of average number of OSPE/OSCE based formative assessments conducted per student in practical labs.
Level-3 plus: If in Level-2 & 3 both, >50% sampled faculties produce evidence.
Supporting Documents: Physical Verification of labs, Records of DOAP sessions, Records of assessments.
Sub-parameter 3.2.2 (Interaction with sampled students by Assessment Team):
Assesses student confirmation of lab usage for DOAP sessions, and if they had prior information/List of subject-wise OSCE/OSPE based formative assessments pre-planned.
Level-1 (25% students confirm): Labs used for DOAP sessions as per Teaching Schedule/Time Table.
Level-1 plus (25-50% students confirm): As above.
Level-2 plus (25% students confirm): Also confirm prior info/list of OSCE/OSPE pre-planned.
Level-3 plus (25-50% students confirm): As above.
Supporting Documents: Subject wise list of OSCE/OSPE, pre-planned DOAP sessions.
Weightage Calculation (Page 65): Formula aggregates scores from these 2 sub-parameters.
Parameter 3.3: Skill Laboratory and its utilization (Qualitative)
Operational Explanation (Page 65):
References MSR-2023: Every medical institution shall have a Skills Laboratory. Six (6) weeks of skill lab training including evaluation before students are posted to wards for clinical training shall be mandatory.
Verification: Interaction with sampled students (2nd & 3rd Prof Part-1 & 2) and clinical subject faculties.
Scoring Rubrics (Pages 65-66):
Sub-parameter 3.3.1 (Availability and usages of Skill Laboratory by Faculty):
Level-1: Medical College has Skill Laboratory.
Level-1 plus: Min 25-50% sampled faculties produce documented evidence of DOAP Sessions conducted by them in Skill Lab.
Level-2 plus: Min 25-50% sampled faculties produce evidence of average number of OSPE, OSCE & DOPS based formative assessments conducted per student in Skill Lab.
Level-3 plus: If in Level-2 & 3 both, >50% sampled faculties produce evidence.
Supporting Documents: Physical Verification of Skill Lab, Records of subject-wise DOAP sessions.
Sub-parameter 3.3.2 (Interaction with sampled students by Assessment Team):
Similar structure to 3.2.2, assessing student confirmation of Skill Lab usage for DOAP sessions and pre-planned OSCE/OSPE/DOPS assessments.
Weightage Calculation (Page 67): Formula aggregates scores from these 2 sub-parameters.
Parameter 3.4: Audio-Visual Aids and digital teaching-learning facilities (Qualitative)
Operational Explanation (Page 67): Emphasizes availability of Audio-Visual Aids facilities in Lecture Theatres, Teaching Rooms, Museums, Practical Labs, Skill Lab, MEU Dept/Unit, Central Library, and the Practical Lab with CAL modules for Pharmacology.
Scoring Rubrics (Pages 67-68):
Level-1: College has all Lecture Theatres, Teaching Rooms, Labs (Practical & Skill) equipped with Audio-Visual Aids; Pharmacology Lab has CAL modules; MEU has required audio-visual facilities per DCF.
Level-1 plus: MEU uses LMS (Learning Management System) during in-house FDP or FDP for other colleges.
Level-2 plus: College has created LMS (e.g., MOODLE) for managing teaching-learning with features like assignments/submissions, formative assessments (knowledge & skill domains), making recorded DOAPs/Bed Side Clinics accessible.
Level-3 plus (Skill Laboratory equipped with any of):
VR/augmented reality-based Simulation software.
Computer-Based Simulations for AETCOM Modules.
Computer-Based Simulations for training Clinical Skills & Competences.
Computer-Based Simulations for conducting performance-based assessment (OSCE/OSPE/DOPS).
Supporting Documents: Physical Verification, user logs for LMS/Simulations.
Parameter 3.5: Student amenities (Qualitative)
Operational Explanation (Page 69): MSR-2023 requires adequate student amenities, including separate common rooms for boys and girls, cafeteria, cultural activities, indoor games, student counselling, gymnasium, outdoor games, and track events.
Verification: Interaction with sampled students.
Scoring Rubrics (Pages 69-71):
Sub-parameter 3.5.1 (Basic Student Amenities):
Level-1: Has cafeteria, separate common rooms, gymnasium, facilities for >1 indoor sport.
Level-1 plus: Also has auditorium or multi-purpose hall for cultural activities.
Level-2 plus (Outdoor - Minimum 2): Facility for Badminton, Tennis Court, Basketball court, Volleyball, Football, Cricket, Athletic Track.
Level-3 plus (Outdoor - More than 2): Facility for more than two outdoor sports; the public notice labels this tier "Level-2 plus" again, which appears to be a typo for Level-3 plus or higher.
Supporting Documents: Physical verification.
Sub-parameter 3.5.2 (Organization of Annual Sports activities):
Level-1: College organizes Annual Sports activities each academic year.
Level-1 plus (Min 25% students confirm): Org. of Annual Sports with min 2 outdoor & 2 indoor sports, evidenced by recording/photos.
Level-2 plus (Min 25-50% students confirm): Org. of Annual Sports with >2 outdoor & >2 indoor sports.
Level-3 plus (>50% students confirm): Org. of Annual Sports with >4 outdoor & >4 indoor sports.
Supporting Documents: Documented evidences of organization (recording & photos).
Sub-parameter 3.5.3 (Organization of Annual Cultural Program): Similar levels and student confirmation for Annual Cultural Program with minimum 2, >2, or >4 activities.
Sub-parameter 3.5.4 (Measures for Hygiene and Sanitation):
Level-1: SOPs for maintenance sanitation & hygiene in Medical College (washrooms, Classrooms, campus, Cafeteria).
Level-1 plus: Also SOPs for Hostel (washrooms, Mess, Rooms).
Level-2 plus: Also SOPs for attached teaching hospital (washrooms, OPD area washrooms, cafeteria).
Level-3 plus: Min 70% sampled students satisfied with sanitation/cleanliness in College, Hospital, Hostel.
Weightage Calculation (Page 71): Formula aggregates scores from these 4 sub-parameters.
Parameter 3.6: Hostel accommodation (Qualitative)
Operational Explanation (Pages 71-72):
MSR-2023: Accommodation for at least 75% of all enrolled students and interns, and for all girl students who request it. Independent furniture (chair, table, bed, full-size cupboard). Minimum 9 sq.m. area per student. Desirable: single/double rooms. Adequate recreational and dining facilities, 24x7 security.
Scoring Rubrics (Page 72):
Level-1: Accommodation for 75% of students/interns/residents per MSR.
Level-1 plus (separately for Girls & Boys Hostels, each equipped with):
24 hrs water supply & quality drinking water.
24 Hrs manned security.
Indoor/outdoor sports facilities.
Mess Facilities.
Adequate washroom & toilets.
Cleanliness & sanitation in entire block.
Computer systems and internet facilities.
Level-2 plus: Providing Hostel facilities with mess to all interns during internship at UHCs/RHCs.
Level-3 plus: Accommodation for 75% students/interns/residents in single or double occupancy rooms.
Supporting Documents: Physical verification, Records of occupancy & student accommodation.
Parameter 3.7: Anti-ragging and gender harassment prevention measures (Qualitative)
Operational Explanation (Page 73):
References Ragging Prohibition Regulations (NMC, 2021) and gender harassment prevention measures (Hon’ble Supreme Court orders).
Scoring Rubrics (Pages 73-74):
Sub-parameter 3.7.1 (Anti-Ragging Measures):
Level-1: Constituted Anti-ragging Committee & Squad in sync with regulations, vigilant 24x7.
Level-1 plus: Contact numbers of accountable officers/faculty/staff shared/displayed. Zero tolerance policy disseminated (electronic/print). Students submit undertaking.
Level-2 plus: Counselling services for Freshers/others by professional counsellors for adjustment issues. Proactive measures for fresher-senior interaction (mentoring cells).
Level-3 plus: Squad identified potential hot spots. Reported Ragging investigated thoroughly & resolved timely.
Supporting Documents: Constitution of Committees/Squad, MOMs, Display/sharing of contacts, Evidences of counselling, investigations.
Sub-parameter 3.7.2 (Measures for Gender Harassment Prevention): Similar levels for functional POSH/Internal Complaint Committee, display of contacts, dissemination of zero tolerance policy, frequent sensitization/awareness programs based on POSH Act, and timely investigation/resolution of reported cases.
Weightage Calculation (Page 75): Formula aggregates scores from these 2 sub-parameters.
Parameter 3.8: Hospital infection prevention and control (Qualitative)
Operational Explanation (Page 76):
NMC (18th Oct 2021) advised colleges to constitute an HICC (Hospital Infection Control Committee)/HICT (Hospital Infection Control Team). NCDC-MoHFW has notified guidelines for Hospital Infection Prevention & Control.
Scoring Rubrics (Pages 76-77):
Level-1: Constituted HICC (with senior microbiologist/medical faculties) OR HICT (Infection Control Officer/Nurse/microbiologist).
Level-1 Plus: HICC meets regularly/monthly for tracking policy implementation. HICT meets daily for measure implementation.
Level-2 plus: SOPs for essential policies (Antimicrobial, Surveillance, Disinfection, Isolation, Outbreak investigation) developed. 100% of staff (doctors, residents, interns, nursing, housekeeping) trained on SOPs for prevention & control of infections in all Clinical Depts & Critical Care Units.
Level-3 Plus: All policies-based SOPs are being implemented in all clinical departments and critical care units.
Supporting Documents: Constitution of HICC/HICT, MOMs, Daily audit records of HICT, SOPs, Training records, Evidences of SOP implementation.
Parameter 3.9: Radiation safety and AERB compliance (Qualitative)
Operational Explanation (Page 77): Compliance with AERB regulations for the housing and operation of medical radiation/imaging facilities in the hospital.
Scoring Rubrics (Pages 77-78):
Level-1: College adheres to AERB regulations for housing of medical radiation/imaging facilities.
Level-1 plus: All Medical Radiation equipment (X-Ray, CT, USG etc.) owned by teaching hospital is certified by AERB through e-LORA.
Level-2 plus: Stringent SOPs for operational & design safety for Radiation Equipment. Periodical audit of operational & design safety conducted (criteria: qualified person handling, protective accessories, TLD usage, preventive maintenance, periodic QA, regulatory updates, Patient Dose Management, Protection Measures).
Level-3 plus: College, based on periodical audit, identifies gaps (if any) and takes measures for enforcement of operational & design safety for radiation equipment.
Supporting Documents: Evidences for compliance with AERB regulations, e-LORA certificates, SOPs, Audit reports, Evidence of audit & enforcement.
Summary of Criterion 3:
Criterion 3, with its substantial weightage of 130 points and entirely qualitative parameters, underscores the NMC's commitment to ensuring a holistic, safe, and supportive environment for medical education. It goes beyond mere academic delivery to assess the very fabric of the institution – from the adequacy of libraries and labs, and the robustness of ICT infrastructure, to critical safety measures like anti-ragging, gender harassment prevention, biomedical waste management, infection control, radiation safety, and fire preparedness. Student amenities like sports and hostel facilities are also key. The assessment relies heavily on physical verification, review of policies and SOPs, examination of records (MOMs, training, audits), and interaction with students and staff to gauge functionality, utilization, and compliance. This criterion ensures that medical colleges are not just places of learning but also safe, well-managed, and psychologically supportive environments conducive to producing well-rounded and competent medical professionals.
Criterion 4: Students' Admission, Attainment of Competence and Progression (Weightage: 140)
This criterion is a cornerstone of the NMC's assessment framework, carrying a significant weightage of 140 points. It focuses directly on the students: the quality of those admitted, their ability to attain the prescribed competencies during their medical education, and their successful progression to further studies or professional roles. It serves as a key indicator of the effectiveness of the teaching-learning process and the overall academic quality of the institution.
Overall Operational Explanation (as per Page 7 and specific parameter explanations):
Core Focus: As the title suggests, this criterion evaluates three interconnected aspects:
Admission: The academic caliber of students entering the MBBS program, primarily assessed through proxy indicators like NEET-UG scores.
Attainment of Competence: The ability of students to acquire and demonstrate the subject-wise competencies specified in the Competency-Based Medical Education (CBME) framework and GMER-2023. This is a direct measure of learning outcomes.
Progression: The success of students in advancing in their medical careers, particularly by securing admissions into postgraduate (PG) medical programs.
Link to CBME and GMER-2023: The assessment of competency attainment is directly tied to the competencies specified in GMER-2023. The framework emphasizes practical demonstration of these skills.
Proxy Indicators for Quality:
NEET-UG scores of admitted students are used as a proxy for the reputation of colleges among aspiring medical students.
NEET-PG scores of alumni are used as a proxy for the quality of the teaching-learning environment in the concerned medical colleges.
Alignment with Regulations: Parameters under this criterion serve as indicators for evaluating the quality of the teaching-learning process, academic excellence, and standards laid down by UGMEB for Medical Colleges, as mentioned in the Assessment and Rating Regulations-2023.
Detailed Breakdown of Parameters under Criterion 4:
Parameter 4.1: Demonstration of procedures by Sampled students in Procedure & Clinical Skill Laboratory/Simulated Setting (Weightage: 35; Qualitative)
Operational Explanation (Page 81-82):
With reference to CBME, students must have hands-on experience in acquiring subject-specific competencies in Practical Labs, Skill Labs, Simulated settings, and real clinical settings.
Skill Laboratories facilitate strengthening mastery over skills through practice, especially pre-requisites for actual clinical settings. Students develop mastery by operating over Standardized/Simulated Patients (SPs) or Computer-Based Simulations.
Sampling of Students (Page 82):
Total students sampled from all professional years: not less than 30 and not more than 60 (a Central Limit Theorem consideration for sample adequacy).
First & Second Professional Students: Randomly assigned Practical or clinical skill competencies/procedures to demonstrate in Skill Laboratory or simulated setting OR clinical setting as per subject requirements.
Third Professional (Part-1 & 2) and Interns: Assigned clinical skill competencies/procedures to demonstrate in real clinical setting (covered under Parameter 4.2).
Stratification: Sampling based on NEET Score or University Examinations (high performing, average performing, other than high & average performing), with one randomly sampled student from each stratum in a group task.
Group Tasks: For each subject, a different set of 3 students is auto-generated by the Assessment and Rating Portal. The group decides which steps/questions to perform/respond to.
If required articles/equipment are unavailable for an assigned task, a new task is assigned. If unavailable for both, "not performed" is submitted.
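The stratified sampling above can be sketched as follows; the stratum labels and data shape are assumptions, since the Assessment and Rating Portal's internals are not described in the draft:

```python
import random


def sample_group(students_by_stratum, seed=None):
    """Draw one assessment group of three: one randomly chosen student from
    each performance stratum (high, average, other), per the draft's design.

    students_by_stratum: dict mapping stratum name -> list of student IDs.
    """
    rng = random.Random(seed)
    return [rng.choice(students_by_stratum[s])
            for s in ("high", "average", "other")]
```

Each subject would get a freshly drawn group, so repeated calls (with different seeds) model the portal's per-subject auto-generation of a new set of 3 students.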
Scoring Rubrics (Pages 83-86):
Consists of 9 sub-parameters (4.1.1 to 4.1.9) focusing on group performance (by 3 students) in demonstrating tasks/procedures for specific subjects in their respective professional years and assessment settings:
4.1.1: Human Anatomy (First Prof - Practical Lab)
4.1.2: Physiology (First Prof - Practical Lab)
4.1.3: Biochemistry (First Prof - Practical Lab)
4.1.4: AETCOM (First Prof - Skill Lab)
4.1.5: (Typo in document, likely AETCOM for Second Prof or another First Prof subject - Skill Lab)
4.1.6: Pathology (Second Prof - Practical Lab)
4.1.7: Pharmacology (Second Prof - Practical/Clinical Lab)
4.1.8: Microbiology (Second Prof - Practical Lab)
4.1.9: Forensic Medicine and Toxicology (Second Prof - Practical Lab or Skill Lab)
(Note: AETCOM is listed twice, and some subject/setting combinations might need clarification based on the final framework.)
Levels 1-4: Based on the group's performance correctness: <40%, 41-60%, 61-80%, >80%.
Supporting Documents: App-based auto-generation of student groups, random assignment of group tasks to the group, app-based evaluation of group for assigned tasks.
Weightage Calculation (Page 87): The average of scores from the 9 sub-parameters is calculated, then multiplied by the Assigned Weightage (AW) of 35.
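The correctness-to-level mapping and the weightage calculation can be sketched together in Python. Converting a Level to the fractional score Level/4 is an assumption made for illustration; the summary above does not state how a Level becomes a score:

```python
def correctness_level(pct):
    """Map a group's performance correctness (%) to Level 1-4 using the
    ranges on Pages 83-86: <40%, 41-60%, 61-80%, >80%."""
    if pct < 40:
        return 1
    if pct <= 60:
        return 2
    if pct <= 80:
        return 3
    return 4


def parameter_score(correctness_by_sub, assigned_weightage=35):
    """Average the sub-parameter scores and multiply by the Assigned
    Weightage (AW).  Assumes Level n contributes n/4 of a full score."""
    levels = [correctness_level(p) for p in correctness_by_sub]
    return sum(lv / 4 for lv in levels) / len(levels) * assigned_weightage
```

Under this assumption, a college whose nine sampled groups all exceed 80% correctness would receive the full 35 points.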
Parameter 4.2: Demonstration of Clinical procedures/clinical skill competency by sampled students/interns at Clinical site (Hospital) (Weightage: 35; Qualitative)
Operational Explanation (Page 87):
Based on NMC curriculum, students (Third Professional Part-1 & 2) are exposed to longitudinal patient care in outpatient/inpatient settings. Subject-specific clinical skills are assessed.
Interns, after summative exams, undergo rotatory internship and demonstrate clinical skill competencies in real clinical settings.
Sampling of Students (Page 88):
Same sampling strategy as Parameter 4.1 (stratified, group tasks, auto-generation).
Third Professional (Part-1 & 2) and Interns: Assessed in concerned clinical departments at the teaching hospital. If no patient available for an assigned task, a new assessment task is assigned.
Scoring Rubrics (Pages 88-97):
Extensive list of 27 sub-parameters (4.2.1 to 4.2.27) covering various clinical subjects for Third Professional Part-1, Part-2, and Interns. Each assesses group performance (by 3 students/interns) on assigned clinical tasks/procedures.
Examples of Subjects Covered: General Medicine, General Surgery, Obstetrics & Gynaecology, Paediatrics, Orthopaedics, Dermatology, Community Medicine, AETCOM, Psychiatry, ENT, Ophthalmology, Anaesthesiology & Radiodiagnosis, Casualty.
Assessment Setting: Concerned clinical department.
Levels 1-4: Based on the group's performance correctness: <40%, 41-60%, 61-80%, >80%.
Supporting Documents: Same as Parameter 4.1 (app-based systems for group generation, task assignment, evaluation).
Weightage Calculation (Page 97): The average of scores from all 27 sub-parameters is calculated, then multiplied by AW of 35.
Parameter 4.3: Average NEET Scores of students admitted to the UG Programme in the last 5 academic calendars (Weightage: 20; Quantitative)
Operational Explanation (Page 98):
NEET scores of students admitted to the UG program in concerned colleges are captured.
College-wise average score is computed. This acts as a proxy for the reputation of the college among students.
Related to academic excellence category of assessment and rating criteria.
Only NEET-UG scores of students admitted under General Category/unreserved shall be considered.
Normalization (Page 98):
Formula: (x′ − x) / (y − x) × 100
x′ = per-student average score obtained by the concerned college.
y = maximum average score obtained by any college.
x = minimum average score obtained by any college.
The resulting value ranges from 0 to 100.
Scoring Rubrics (Page 99):
Level-1 to Level-4: Based on the normalized score ranges: ≤25, >25 to ≤50, >50 to <75, ≥75.
Note: For normalization, the average value shall be calculated with respect to sanctioned intakes for the UG Program.
Supporting Documents: Required data to be submitted by College.
Parameter 4.4: Average PG NEET Scores of UG Graduated students/UG alumni qualified minimum Cut-off Percentile in recently conducted NEET PG (Previous Year) (Weightage: 10; Quantitative)
Operational Explanation (Page 99):
Information on how many UG graduated students/alumni appeared for NEET-PG and how many cleared the minimum cut-off percentile in the past 1 year.
Proxy for the quality of teaching-learning process in College.
Related to standards of education and academic excellence category.
College submits NEET-PG data for students/alumni who appeared and qualified. NEET-PG scores of students qualified under General Category/unreserved considered.
Normalization (Page 100):
Same formula as Parameter 4.3.
Scoring Rubrics (Page 100):
Level-1 to Level-4: Based on normalized score ranges: ≤25, >25 to ≤50, >50 to <75, ≥75.
Note: For normalization, the average value shall be calculated with respect to sanctioned intakes for the UG Program.
Supporting Documents: Required data to be submitted by College.
Parameter 4.5: No. of UG Graduated students/UG Alumni taken admission in PG under recently organized AIQ (All India Quota) counselling by MCC and State Counselling in previous year (Weightage: 20; Quantitative)
Operational Explanation (Page 101):
Captures data on college-wise qualified NEET-PG students who secured admission in Medical College through AIQ (MCC) or State Counselling.
Related to student progression. Proxy for quality of teaching-learning and standards of education/academic excellence.
Scoring Rubrics (Page 101):
Sub-parameter 4.5.1 (PG Admission of students/alumni under AIQ MCC Counselling):
Level-1: <3% of sanctioned intake (of UG program) of students/alumni took PG/MD admission under AIQ MCC in last academic calendar.
Level-2: Equivalent to 3% to 5%.
Level-3: Equivalent to 6% to 8%.
Level-4: >8%.
Sub-parameter 4.5.2 (PG Admission of students/alumni under MCC and State Government Counselling - combined):
Level-1: <5% of sanctioned intake took PG/MD admission.
Level-2: 5% to 10%.
Level-3: 10% to 15%.
Level-4: >15%.
Percentage calculated with respect to sanctioned intakes for UG Program.
Supporting Documents: College to provide data of MBBS qualified NEET-PG students and their MD admission details.
Weightage Calculation (Page 102): Formula aggregates scores from these 2 sub-parameters.
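The two sub-parameter rubrics above can be expressed as simple threshold functions. Note two quirks of the printed brackets that the sketch must resolve by assumption: rubric 4.5.1 leaves the 5-6% range unassigned (folded into Level-2 here), and rubric 4.5.2 lists 10% and 15% in two adjacent brackets (the lower level is assumed at the boundary).

```python
def level_aiq_mcc(pct_of_intake):
    """Sub-parameter 4.5.1 (AIQ MCC counselling), per the stated brackets:
    <3% -> Level-1, 3-5% -> Level-2, 6-8% -> Level-3, >8% -> Level-4.
    Assumption: the unassigned 5-6% gap is treated as Level-2."""
    if pct_of_intake < 3:
        return 1
    if pct_of_intake <= 5:
        return 2
    if pct_of_intake <= 8:
        return 3
    return 4

def level_combined_counselling(pct_of_intake):
    """Sub-parameter 4.5.2 (MCC + State counselling combined):
    <5% -> Level-1, 5-10% -> Level-2, 10-15% -> Level-3, >15% -> Level-4.
    Assumption: boundary values (10%, 15%) take the lower level."""
    if pct_of_intake < 5:
        return 1
    if pct_of_intake <= 10:
        return 2
    if pct_of_intake <= 15:
        return 3
    return 4
```

Both percentages are computed against the sanctioned UG intake, as the draft specifies.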
Parameter 4.6: Performance of Students in Summative Assessment/Exit Examination in the last academic year (Weightage: 20; Quantitative)
Operational Explanation (Page 102):
NMC plans for National Exit Test (NeXT) as a yardstick for quality.
Until NeXT is conducted, performance in Summative Assessment (by Affiliating Body) and Internal Assessment (at Medical College level) will be captured.
Proxy for attainment of competencies & learning outcomes. Related to academic excellence & standards of medical education.
Scoring Rubrics (Pages 103-104):
Sub-parameter 4.6.1 (% of appeared students qualified minimum passing percentage in each theory):
Level-1 to Level-4: Based on percentage brackets (<50%, 50-70%, 71-90%, >90%) of students achieving this.
Sub-parameter 4.6.2 (% of appeared students qualified minimum passing percentage in Practical/Clinical Assessment of each subject):
Level-1 to Level-4: Similar percentage brackets.
Sub-parameter 4.6.3 (% of appeared students who have secured minimum passing percentage in each subject AND overall secured minimum 75% in theory Examination):
Level-1 to Level-4: Based on percentage brackets (<10%, 10-15%, 16-20%, >20%).
Sub-parameter 4.6.4 (% of appeared students who have secured minimum passing percentage in each Practical/Clinical Examination AND overall secured minimum 75% in Practical/Clinical Examination):
Level-1 to Level-4: Similar percentage brackets as 4.6.3.
*Students of all professional years considered.
Supporting Documents: Recently held Summative Assessment Data for theory & practical/clinical Examinations for students of all professional years.
Weightage Calculation (Page 104): Formula aggregates scores from these 4 sub-parameters.
Summary of Criterion 4:
Criterion 4, with its high weightage of 140 points, places a strong emphasis on student-centric outcomes. It meticulously assesses the entire student journey: the academic profile of incoming students (via NEET-UG), their ability to demonstrate acquired competencies in both simulated and real clinical environments (a significant portion of this criterion's focus), and their success in progressing to postgraduate studies (via NEET-PG performance and admissions) and clearing summative examinations. The use of proxy indicators like NEET scores reflects an attempt to quantify institutional reputation and teaching quality, while the direct assessment of competency demonstration provides tangible evidence of learning. The detailed, multi-level rubrics, especially for skill demonstration, indicate a rigorous and evidence-based evaluation process. This criterion is pivotal in determining how well a medical college is fulfilling its primary mandate of producing competent and successful medical graduates.
Criterion 5 carries the highest weightage (160 points) in the entire framework, underscoring the paramount importance the NMC places on the faculty and the processes that facilitate teaching and learning. It delves into the adequacy, quality, stability, and academic engagement of the teaching staff, as well as the methods employed in the educational process and student participation. A strong performance in this criterion indicates a robust academic environment conducive to producing competent medical professionals.
Overall Operational Explanation (as per Page 8 and specific parameter explanations):
Core Focus: This criterion evaluates several critical aspects related to human resources (primarily faculty) and the pedagogical approaches used:
Faculty Strength and Stability: Ensuring an adequate number of qualified faculty vis-à-vis sanctioned student intake and regulatory norms, and minimizing faculty attrition.
Faculty Quality and Development: Assessing faculty qualifications beyond the minimum requirements, their engagement in academic presentations, and their contributions to educational materials.
Teaching-Learning Methods: Examining the pedagogical techniques employed by faculty in both theory and practical/clinical sessions.
Student and Faculty Achievements: Recognizing academic excellence through awards and presentations by both students and faculty.
Alignment with Regulations:
Parameters are related to standards laid down by UGMEB for Medical Colleges.
"Quality of faculty" is highlighted as essential for academic excellence.
Presentations by students and faculty in recognized conferences align with the rating parameter of “Participation of students/faculty in academic activities at national and international level” as per Assessment and Rating Regulations-2023.
Student Interaction: For several parameters, sampled students will be interacted with, linking this criterion also with students’ feedback on various affairs of the Medical College category of parameters.
Professional Bodies: Conferences organized by registered professional bodies (registered with EMRB-NMC or State Medical Council) are considered valid for academic presentations.
Detailed Breakdown of Parameters under Criterion 5:
Parameter 5.1: Teaching –learning methods being employed by sampled Faculties in their Theory classes (Weightage: 20; Qualitative)
Operational Explanation (Page 104-105):
Captures the type of teaching and training methods used by faculty in theory classes.
Emphasizes that methods should facilitate mastery over subject-specific competencies, primarily addressing the "Knows (K)" and "Knows How (KH)" levels of Miller’s Pyramid (cognitive development).
Verification (Page 105):
Interaction with 5% of faculties sampled from each department (drawn from the list sampled for Criterion-1).
Assessment based on observations pertaining to broad components of theory sessions:
Alignment of Theory sessions with prescribed Competencies.
Alignment of Theory sessions with Specific Objectives (K & KH levels of Miller’s Pyramid).
Use of Formative Assessment Methods (Formal & informal).
Usages of Audio-visual aids.
Scoring Rubrics (Pages 105-108):
Sub-parameter 5.1.1 (Alignment of Theory sessions with Competencies prescribed):
Level-1 to Level-4: Based on percentage of sampled faculties (<50%, 50-70%, 71-90%, >90%) able to produce documented or electronic evidence (recorded videos/live-streamed videos) of how theory classes are aligned with prescribed Competency.
Supporting Documents: Live streamed videos on NMC portal or Recorded videos by college for last 1 month.
Sub-parameter 5.1.2 (Alignment of Theory sessions with Specific Objectives... K & KH levels): Similar levels and evidence for alignment with specific objectives for K & KH levels.
Sub-parameter 5.1.3 (Formative Assessment Methods - Formal & informal): Similar levels for faculties producing evidence of using varied formative assessment methods (e.g., Clickers, Muddiest points, One Minute Paper).
Sub-parameter 5.1.4 (Interaction with students - Non-Clinical):
Assesses if sampled students can tell which type of competencies were taught, if they are in agreement with others about competences addressed, and their active involvement (asking questions/teaching skills).
Level-1: Fewer than 10 sampled students can tell which type of competencies were taught.
Level-2: A minimum of 10 students agree on the competencies addressed.
Level-3: Level-2, plus the students also agree on the assessment methods/techniques used to evaluate their progress.
Level-4: Level-3, plus a minimum of 10% of students are actively involved (asking questions/teaching skills).
Supporting Documents: Recording of interaction with students.
Sub-parameter 5.1.5 (Interaction with students - Clinical): Similar structure to 5.1.4 but for clinical theory classes.
Sub-parameter 5.1.6 (Medium of Instruction - Regional Language):
Assesses if college has created provisions for organizing classes in regional language and provides Teaching Learning Materials in regional language, based on student confirmation.
Level-1: <50% of sampled students confirm the college has created such provisions.
Level-2: >50% of students confirm.
Level-3: >70% of students confirm.
Level-4: Level-3, plus >70% of students also confirm the college provides TLM in the regional language.
Supporting Documents: Interaction with students, Documented evidence of TLM availability, Time table for regional language classes.
Weightage Calculation (Page 108): Formula aggregates scores from these 6 sub-parameters.
Parameter 5.2: Teaching –learning methods being employed by faculties for practical/clinical sessions in Laboratory/simulated setting/Bed side teaching (Weightage: 20; Qualitative)
Operational Explanation (Page 109):
Captures teaching-learning methods in practical/clinical sessions.
Focus on developing "Show (S)," "Show How (SH)," & "Perform independently (P)" levels of Miller’s Pyramid.
Assesses: Type of competencies, competence level (S, SH, P), teaching methods (DOAP, Bed Side teaching), and Skill Assessment based Formative Assessment tools (OMP, SNAPPS), use of simulations.
Verification (Page 109): Interaction with 5% faculties sampled from each department (from Criterion-1 list). Assessment based on components:
Alignment with prescribed Competencies & Specific Objectives (S, SH & P levels).
Use of Formative Assessment Methods.
Usages of Audio-visual aids.
Scoring Rubrics (Pages 110-112):
Sub-parameter 5.2.1 (Alignment of Practical sessions with Competencies prescribed):
Level-1 to Level-4: Based on % of sampled faculties (<50% to >90%) producing evidence (recorded/live videos) of how practical sessions are aligned with Competency.
Supporting Documents: Live streamed videos on NMC portal or Recorded videos by College for last 1 month.
Sub-parameter 5.2.2 (Alignment... S, SH & P levels of Miller’s Pyramid): Similar for alignment with S, SH & P levels.
Sub-parameter 5.2.3 (Formative Assessment Methods - Formal & informal): Similar for using formative assessment methods (e.g., OMP, SNAPPS, OSCE/OSPE, DOPS).
Sub-parameter 5.2.4 (Observation of Practical Sessions in Skill Laboratory):
Assesses clarity about type of Competency being developed and use of varied formative assessment methods.
Level-1: No clarity about the type of Competency being developed.
Level-2: Clarity on the Competency type.
Level-3: Level-2, plus varied formative assessment methods are used to evaluate progress.
Level-4: Level-3, plus the faculty actively involves a minimum of 10% of students.
Supporting Documents: Recording of Practical sessions.
Sub-parameter 5.2.5 (Observation of Bedside Clinics or teaching): Similar structure to 5.2.4 for bedside clinics.
Weightage Calculation (Page 112): Formula aggregates scores from these 5 sub-parameters.
Parameter 5.3: Programme-wise number of recruited Faculty Staff vis-à-vis Regulatory specifications (Weightage: 20; Quantitative)
Operational Explanation (Page 113):
Deals with programme-wise required teaching staff (Professor, Associate Professor, Assistant Professor).
References Minimum Requirements for Annual MBBS Admission Regulations-2020 and Teachers Eligibility Qualifications in Medical Institutions Regulations-2022.
Emphasizes sufficient teachers for practical instruction/demonstration in small groups, and full-time work.
Scoring Rubrics (Page 113):
Level-1: The number of Faculty and Residents/Tutors/Demonstrators falls short of the required numbers in one or more departments.
Level-2: The number of Faculty (cadre-wise) and Residents/Tutors/Demonstrators is in alignment with the required numbers in all departments, separately and taken together.
Level-3: Level-2, plus within the same department, excess Professors compensate for a deficiency of Associate/Assistant Professors, or excess Associate Professors compensate for a deficiency of Assistant Professors.
Level-4: Level-3, plus the proportion of faculty in the higher cadres (Professor & Associate Professor) exceeds the MSR in one or more departments.
Supporting Documents: Appointment & Joining Letters, Registration & Teacher ID Nos., Academic Qualifications & Professional Experiences.
Parameter 5.4: Programme-wise number of Faculty Staff with additional professional qualifications other than the minimum qualifications laid down by NMC (Weightage: 10; Quantitative)
Operational Explanation (Page 114):
Captures additional professional qualifications beyond NMC minimums (e.g., CPD).
Assumes these contribute to upgrading knowledge/skills.
Teacher Eligibility Qualifications-2022 mention BCME and Basic Course in Biomedical Research completion.
FAIMER-IFI, ACME (FIME), and IFME programs are treated as additional qualifications.
Scoring Rubrics (Page 114): (Based on Common DCF)
Level-1: All Faculty possess minimum academic qualification and experience as per Medical Regulator.
Level-2: 1% to 5% faculties possess additional qualifications as operationally defined.
Level-3: More than 5% faculties possess additional qualifications.
Level-4: More than 10% of total faculty possess additional qualifications.
Supporting Documents: Supporting Documents for additional qualifications.
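The four-level rubric for Parameter 5.4 can be sketched as a threshold function. Note that the printed Level-3 (>5%) and Level-4 (>10%) brackets overlap; the sketch assumes the higher level applies once its threshold is crossed, which is an interpretation, not a statement from the draft.

```python
def level_additional_qualifications(pct_with_additional):
    """Parameter 5.4: level from the % of faculty holding qualifications
    beyond the NMC minimum (e.g., BCME, FAIMER-IFI, ACME, IFME).
    Brackets per the draft: all at minimum -> Level-1, 1-5% -> Level-2,
    >5% -> Level-3, >10% -> Level-4 (overlap resolved upward by assumption)."""
    if pct_with_additional > 10:
        return 4
    if pct_with_additional > 5:
        return 3
    if pct_with_additional >= 1:
        return 2
    return 1
```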
Parameter 5.5: Staff attrition rate in past 2 Years (Weightage: 15; Quantitative)
(Detailed under Module 4, Section 4.4 earlier, as an example of assessing faculty stability.)
Parameter 5.6: No. of prestigious Awards/Grants instituted at International or National or State level availed by students in past 2 years (Weightage: 20; Quantitative)
Operational Explanation (Page 115):
Deals with prestigious awards availed by students (UG only for some categories). Fellowship Awards (Academic/Research) by recognized agencies considered.
Derived from Academic Excellence and Research category parameters (MARB-NMC). Proxy for quality teaching-learning and research ecosystem.
Computation of Scores (Pages 115-118):
ICMR STS (Short Term Studentship) included.
Scores: International Award = 90, National Award = 60, State Level Awards = 30. Average score per student computed across all MBBS professional years.
Detailed Categories for Scoring (different from above simple scores):
Category-1 (200 score/entry): Academic/Research/Studentship Grant/Awards by Central Govt. Body/Institutes or International Org (WHO etc.) (involving National Level screening/selections).
Category-2 (100 score/entry): By State Govt. Ministries/Depts/Institutes or State Health Science Univ. (State Level screening/selections).
Category-3 (100 score/entry): By Deemed University/Private University (minimum University & State Level screening/selections). Certificates need Registrar/VC signature.
Category-4 (25 score/entry): By national Professional Body/National Associations (recognized in Med Edu Fields) without Central/State govt. collaboration.
Normalization Formula (Page 118): Standard formula ((x’) - (x)) / ((y) - (x)) * 100 based on average score per sanctioned intake.
Scoring Levels (Page 119):
Level-1 to Level-4: Based on normalized score ranges: ≤25, >25 to ≤50, >50 to <75, ≥75. (The source prints these as ≥25, >25 to ≥50, >50 to <75, ≤75, which appears to be a typo; the corrected progression matches Parameters 4.3 and 4.4.)
Supporting Documents: Certificate/Award Letters verified for past 2 calendar years.
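The category-based scoring for Parameter 5.6 can be sketched as below. This is an illustrative reading: it assumes each valid award entry earns its category's per-entry score (200/100/100/25 for Categories 1-4), that entry scores are summed, and that the average is taken over the sanctioned UG intake as the Page 118 note directs; the resulting per-college averages would then feed the standard min-max normalization.

```python
# Per-entry scores for student awards, Categories 1-4 (Pages 115-118).
CATEGORY_SCORES_5_6 = {1: 200, 2: 100, 3: 100, 4: 25}

def average_student_award_score(award_categories, sanctioned_intake):
    """Sum the category scores of all valid student award entries and
    average over the sanctioned UG intake (assumed interpretation of the
    'average score per sanctioned intake' normalization basis)."""
    total = sum(CATEGORY_SCORES_5_6[c] for c in award_categories)
    return total / sanctioned_intake
```

For instance, one Category-1, one Category-2, and one Category-4 award at a college with a sanctioned intake of 100 yields an average of 3.25 per seat.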
Parameter 5.7: No. of prestigious Awards instituted at International or National or State level availed by Faculty of College in last 2 Years (Weightage: 15; Quantitative)
Operational Explanation (Page 119): Similar to 5.6 but for faculty. Fellowship Awards (Academic/Research) considered. Proxy for quality teaching-learning and research ecosystem.
Computation & Scoring (Pages 119-121):
Follows the same categorization as Parameter 5.6, scored 250, 100, 100 and 100 per entry for Categories 1-4 respectively (note: Category-1 is 250 here versus 200 for students, and Category-4 is 100 versus 25).
Minimum regular full-time faculties required for UG/PG considered for averaging out per Faculty value.
Normalization Formula (Page 121): Standard formula based on average score per Faculty.
Scoring Levels (Page 122): Similar normalized score ranges as 5.6.
Parameter 5.8: Number of Extra/Co-curricular Student awards instituted at State/National/International level availed by students in past 2 years (for UG Student only) (Weightage: 10; Quantitative)
Operational Explanation (Page 122): Deals with Extra/Co-curricular Awards by recognized association/agency. Derived from Academic Excellence and satisfactory teaching-learning environment categories. Proxy for holistic development.
Computation & Scoring (Pages 122-124):
Category-1 (250 score/entry): Sports, Visual & Performing Arts, Outreach/Social Service Awards by Central Govt. Body/Institutes or International Org (WHO etc.) (min national level screening).
Category-2 (100 score/entry): Similar awards by State Govt. Body or Institutes at State level.
Category-3 (100 score/entry): Similar awards by Deemed/Private University (min state/University level screening).
Category-4 (100 score/entry): Similar awards by Professional Body/National Associations recognized by Regulatory Body (min state level screening).
Normalization Formula (Page 124): Standard formula based on average score per sanctioned intake.
Scoring Levels (Page 124): Standard normalized score ranges.
Parameter 5.9: Number of Faculty Staff contributed in Designing of Course Materials (Online & offline) at International or National or State level recognized platforms in past 2 years (Weightage: 15; Quantitative)
Operational Explanation (Page 124): Contribution to designing courses/materials on recognized platforms (SWAYAM, NMC FDPs, Foreign/Central/State Univ., CPD for EMRB-NMC/SMC recognized bodies). Proxy for faculty quality and academic excellence.
Computation & Scoring (Pages 125-127):
Category-1 (250 score/entry): Designed Study/course/training Materials OR contributed to policy/regulatory documents for Central Govt. Body/Institutes/International Org (WHO etc.), OR contributed to online Platforms (SWAYAM), OR worked as Resource Person for Central Govt. body.
Category-2 (100 score/entry): Similar for State Govt. Body/Institutes or State managed Platforms.
Category-3 (100 score/entry): Similar for Deemed/Private Universities or their online Platforms.
Category-4 (25 score/entry): Designed study/course/materials OR worked as resource persons for national Professional Bodies/associations (without govt. collaboration).
Category-5 (International - Health Education): Designed Study/course/materials OR worked as Resource Persons for Academic/Research Institutes with QS/Times Higher ranking.
Condition-1 (150 score): Ranking up to 250 in past two years.
Condition-2 (50 score): Ranking beyond 250 in past two years.
Multiple entry of same faculty for same assigned tasks not considered multiple times.
Normalization Formula (Page 127): Standard formula based on average score per faculty (considering min regular full-time faculties).
Scoring Levels (Page 127-128): Standard normalized score ranges.
Parameter 5.10: Number of Paper Presentations by Faculty Staff in recognized International/National & State level Conferences/Competitions in last 2 Years (Weightage: 10; Quantitative)
Operational Explanation (Page 128): Invited-speaker, oral, and poster presentations by Faculty in reputed conferences/competitions (organized by professional bodies registered with EMRB-NMC/SMC). Proxy for academic excellence & teaching-learning quality.
Computation & Scoring (Pages 128-130):
Category-1 (250 score/entry): Seminars/conferences sponsored/organized by Central Govt. Body/National Institutes/International Org (WHO etc.) (min national level screening/participation).
Category-2 (100 score/entry): By State Govt. Ministry/Dept/Body or State Health Science Univ/State Affiliating Univ. (min state level screening).
Category-3 (100 score/entry): By Deemed/Private University and Colleges.
Category-4 (25 score/entry): By national Professional Body/Associations (without govt. collaboration).
Category-5 (International - Health Education): Seminars/Conferences by Academic/Research Institutes with QS/Times Higher ranking (Physical Mode only and abroad travelling involved).
Condition-1 (150 score): Ranking up to 250.
Condition-2 (50 score): Ranking beyond 250.
Per Faculty, a maximum of 5 valid entries. Oral presentation = full score; poster presentation = half score.
Normalization Formula (Page 130): Standard formula based on average score per faculty.
Scoring Levels (Page 131): Standard normalized score ranges.
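The per-faculty scoring for Parameter 5.10 can be sketched as follows. It encodes the stated per-entry category scores, the oral-versus-poster weighting (full versus half score), and the cap of 5 valid entries per faculty member. The draft does not say which 5 entries count when a faculty member has more; the sketch assumes the highest-scoring 5 are kept, and the category keys for the international Condition-1/Condition-2 cases are invented labels for illustration.

```python
# Per-entry scores for faculty presentations (Pages 128-130).
# "intl_top250" / "intl_beyond250" are illustrative keys for Category-5's
# Condition-1 (QS/Times ranking up to 250) and Condition-2 (beyond 250).
CATEGORY_SCORES_5_10 = {1: 250, 2: 100, 3: 100, 4: 25,
                        "intl_top250": 150, "intl_beyond250": 50}

def faculty_presentation_score(entries):
    """entries: list of (category_key, is_oral) for one faculty member.
    Oral presentations earn the full category score, posters half; at most
    5 entries count (assumption: the highest-scoring 5 are retained)."""
    scored = sorted(
        (CATEGORY_SCORES_5_10[cat] * (1.0 if oral else 0.5) for cat, oral in entries),
        reverse=True,
    )
    return sum(scored[:5])
```

An oral Category-1 presentation plus a poster in Category-2 would score 250 + 50 = 300; the per-faculty averages across the college would then pass through the standard normalization.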
Parameter 5.11: Number of Academic Presentations by Students in recognized International/National & State level Conferences/Competitions in last 2 Years (UG Student only) (Weightage: 05; Quantitative)
Operational Explanation (Page 131): Similar to 5.10 but for UG students.
Computation & Scoring (Pages 132-133):
Follows the exact same categorization and scoring per entry as Parameter 5.10 (Category 1-5 with relevant scores).
Per student Max 4 valid entries.
Normalization Formula (Page 133): Standard formula based on average score per student.
Scoring Levels (Page 134): Standard normalized score ranges.
Summary of Criterion 5:
Criterion 5, with its substantial 160-point weightage, forms the bedrock of evaluating the academic vitality and human capital of a medical college. It meticulously examines the faculty – their numbers, qualifications, stability (attrition), teaching methodologies in both theoretical and practical settings, and their contributions to research, course development, and academic discourse through presentations and awards. It also extends this scrutiny to student achievements in academic presentations and awards. The criterion employs a mix of qualitative assessments (for teaching methods through direct observation and interaction) and a large number of quantitative parameters that often involve complex scoring based on institutional categories and normalization against peers. This comprehensive approach ensures that colleges are not only meeting regulatory staffing norms but are also fostering an environment of continuous professional development, pedagogical innovation, and active scholarly engagement for both faculty and students. The high weightage signifies that the quality and dynamism of human resources are considered central to the quality of medical education delivered.
Criterion 6: Assessment Policy: Formative, Internal & Summative Assessment (Total Allocated Weightage: 60 points)
This criterion, though carrying a moderate weightage of 60 points, is critical as it delves into the heart of how student learning and competency attainment are measured and guided throughout their medical education. It specifically focuses on the college's policies and practices concerning formative, internal, and summative assessments, ensuring they align with the principles of Competency-Based Medical Education (CBME). All parameters under this criterion are qualitative, emphasizing the processes, alignment, and effective utilization of assessment strategies.
Overall Operational Explanation (as per Page 8 and specific parameter explanations):
Core Focus: This criterion predominantly deals with:
Formative Assessment: Ongoing assessments designed to monitor student learning and provide feedback for improvement.
Internal Assessment (IA): Periodic examinations conducted by the college to evaluate student progress before the final university/board examinations.
Summative Assessment: (Implicitly) How internal assessments prepare students for and align with the principles of summative evaluation, especially within a CBME framework.
Alignment with Competency-Based Curriculum: Ensuring that all assessment methods are geared towards evaluating the competencies (knowledge, skills, attitude/professionalism) prescribed by medical regulators.
Reference to Guidelines: The assessment practices are evaluated against guidelines and curriculum prescribed by UGMEB-NMC. Specific documents like "Module-3 Assessment for CISP-2019," "Logbook guideline-2019," and "GMER-2019 & 2023" are referred to for deriving parameters and specifying rubrics.
Emphasis on Process and Feedback: The criterion highlights the importance of continuous and comprehensive assessment, the use of diverse assessment tools, tracking learning progress (e.g., through Logbooks), and providing remedial support based on performance analysis.
Student Interaction: For several parameters, sampled students will be interacted with, aligning this criterion also with students’ feedback on various affairs of the Medical College.
Detailed Breakdown of Parameters under Criterion 6:
Parameter 6.1: Regular Periodical Internal Assessment (IA) Examinations for theory & Practical/Clinical vis-à-vis NMC Guideline for Competence Based Assessment (CBA) (Weightage: 15; Qualitative)
Operational Explanation (Page 135):
Medical Regulators mandate regular periodical IA examinations prior to professional year-end Summative Assessment.
Within CBME, IAs have paramount importance. They should assess all aspects of learning (Cognitive, Affective, Psychomotor).
Components of IA include:
Theory Paper: Aligned with competencies; questions framed for K (Knows) and KH (Knows How) levels of Miller’s Pyramid.
Practical/Clinical Examination: Assesses Skill Competencies, Communication, Attitude; uses multiple tools (OSCE, OSPE, DOPS, Mini CEX) for S (Shows), SH (Shows How), P (Performs) levels.
Log Book based Assessment: Certifiable competencies certified in Log Books.
Professional Development Programme (PDP)-AETCOM: Competencies integrated with theory & practical/clinical IAs.
Practical/clinical examination data (hard copy/electronic) for all professional years checked on random basis.
Verification (Page 136):
Faculty Interaction: Department-wise 25% sampled faculties (from Criterion-1 list).
Student Interaction: Same set of students sampled under Criterion-1.
Scoring Rubrics (Pages 136-140):
Sub-parameter 6.1.1: Conduct of required number of Internal Assessment (IA) Examinations with supporting evidences:
Level-1: Conducting fewer than the prescribed number of IA examinations, professional year-wise.
Level-2: Conducting IA, professional year-wise, as per the numbers prescribed by the Regulator.
Level-3: Level-2, plus able to produce documented evidence (exam papers for all subjects, verified answer sheets) for the recently conducted IA for all professional years.
Level-4: Level-3, plus able to produce documented/electronic evidence (student-wise filled assessment tools for Practical/Clinical skill assessments, Internal Assessments conducted at End of Postings - EOP) for sampled students of all professional years.
Supporting Documents: Answer sheets of sampled batches, filled assessment tools by faculties for Practical/Clinical Skill assessments.
Sub-parameter 6.1.2: Planning & Conduct of Theory Papers:
Assesses if sampled faculties show documented evidence of guidelines for developing Theory papers aligned with subject-specific K/KH competency levels (Miller’s Pyramid, Bloom’s Taxonomy).
Level-1 to Level-4: Based on percentage of faculties (<50%, 50-70%, 71-90%, >90%) demonstrating this.
Supporting Documents: Documented Guidelines, Developed subject-wise Theory Papers for recently held IA.
Sub-parameter 6.1.3: Planning & Conduct of Practical/Clinical Skill Assessments: Similar levels for evidence of guidelines for Practical/Clinical Assessments aligned with S/SH/P competency levels.
Sub-parameter 6.1.4: Planning and Conducting assessment of Professional Development Program (PDP)-AETCOM:
Assesses inclusion of PDP-AETCOM competencies in IA Theory papers, Practical/Clinical assessments, and viva-voce.
Level-1 to Level-4: Based on level of integration and testing of acquired skills.
Supporting Documents: Clinical & Practical Assessments etc. (implying AETCOM related assessment tools).
Sub-parameter 6.1.5: Objective & Structured scoring process for Theory and Practical/Clinical Assessments in IA Examinations:
Assesses if sampled faculties can produce objective, structured & self-explanatory scoring criteria & process.
Level-1 to Level-4: Based on percentage of faculties demonstrating this.
Supporting Documents: Scoring Sheets of recently held Theory and practical & clinical assessments, filled assessment tools/sheets for scoring student performance.
Sub-parameter 6.1.6: Scoring Performance of Students in Practical/Clinical Assessments in IA Examinations vis-à-vis Interaction with 5% sampled students:
Assesses if sampled students can tell which type of assessment tools were used and how (scoring criteria) they were given scores in recently conducted Practical/Clinical IA.
Level-1 to Level-4: Based on percentage of students demonstrating this.
Weightage Calculation (Page 140): Formula aggregates scores from these 6 sub-parameters.
Parameter 6.2: Usages of Formative Assessment methods vis-à-vis Continuous and Comprehensive Assessment Process (Weightage: 15; Qualitative)
Operational Explanation (Page 140):
Formative assessments are integral to CBME for theory, practical, and clinical teaching.
Continuous and ongoing formative assessments provide positive/developmental feedback.
Learning evidence should be collected using multiple assessment tools and from multiple settings (Demo Room, Lab, Simulated setting, real clinical setting).
Verification (Page 141):
Faculty Interaction: Department-wise 25% sampled faculties (from Criterion-1 list).
Student Interaction: Same set of students sampled under Criterion-1.
Scoring Rubrics (Pages 141-142):
Sub-parameter 6.2.1: Planning and conduct of Formative Assessments vis-à-vis theory teaching:
Assesses if sampled faculties produce documented evidence about how they plan and conduct formative assessments (type of tools, relation to K/KH levels, periodicity).
Level-1: <50% of sampled faculties produce evidence of planning and conducting formative assessments.
Level-2: 50-70% of faculties produce evidence covering the type of tools used, their relation to the K/KH levels of Miller’s Pyramid, and periodicity.
Level-3: 71-90% of faculties, as above.
Level-4: >90% of faculties, as above.
Supporting Documents: Documented evidences of usages of multiple assessment tools for theory (One Minute Paper, Clickers, Muddiest Point, MCQs, Clinical Case Discussion, Problem based Questioning, Assignments, Online Assessment through LMS etc.).
Sub-parameter 6.2.2: Planning and conduct of Formative Assessments vis-à-vis Practical & clinical teaching: Similar levels for formative assessments in practical/clinical teaching (tools, relation to S/SH/P Miller’s Pyramid levels, periodicity).
Supporting Documents: Evidences for tools like OMP, SNAPPS, OSCE, OSPE, DOPS, Mini-CEX.
Sub-parameter 6.2.3: Conducts of Formative Assessments for positive & developmental feedback to students vis-à-vis Interaction with 5% sampled students:
Assesses if sampled students can describe how frequently they receive positive feedback prompting them to reflect on their performance vis-à-vis competencies.
Level-1 to Level-4: Based on percentage of students confirming this (<50%, 50-70%, 71-90%, >90%).
Weightage Calculation (Page 143): Formula aggregates scores from these 3 sub-parameters.
Parameter 6.3: Log Books & Portfolio based Tracking learning progress of students vis-à-vis laid down clinical Skills/Competences (Weightage: 15; Qualitative)
Operational Explanation (Page 143):
CBME emphasizes logbook-based capture of learning evidence; the NMC defines a Log Book as a verified record of learner progression.
Should document competence-wise performance/learning, activities allowing demonstration, learning contexts (Skill Lab, Seminars, Symposia, patient/community interactions), rating for competence, and faculty decision.
Practices checked:
Subject-wise finalization of essential certifiable competences in Log book.
Activity organized for acquiring & showcasing achieved competence.
Rating given by faculty.
Decision on certification, repeating activity, or remediation.
Weightage given to Log Book based assessment in IA/Examinations.
Verification (Page 144):
Faculty Interaction: Department-wise 25% sampled faculties (from Criterion-1 list).
Student Interaction: Same set of students sampled under Criterion-1.
Scoring Rubrics (Pages 144-146):
Sub-parameter 6.3.1: Guideline for Activities and Certifiable Competencies to be placed in Logbooks:
Assesses if sampled faculties can produce documented evidence about which type of activities and certifiable competencies are to be placed in logbooks by students for their subjects.
Level-1 to Level-4: Based on percentage of faculties demonstrating this (<50%, 50-70%, 71-90%, >90%).
Supporting Documents: Documented evidence of how Logbooks will be created and what type of activities/competencies will be placed subject-wise for each professional year.
Sub-parameter 6.3.2: Methods for Certification of acquiring of certifiable competencies by students:
Assesses if faculties can produce evidence about: the type of activities organized for certifiable competencies, the criteria for rating whether a student meets expectations, and the process for further accomplishment, repeat activity, or remediation.
Level-1 to Level-4: Based on percentage of faculties demonstrating this.
Supporting Documents: Logbooks maintained by students.
Sub-parameter 6.3.3: Maintenance of Logbooks by students:
Assesses if sampled students can show logbooks maintained as per instructions and signed by the concerned faculties.
Level-1 to Level-4: Based on percentage of students demonstrating this.
Supporting Documents: Logbooks maintained by students across all Professional Years.
Sub-parameter 6.3.4: Process of Certification of Competencies vis-à-vis Interaction with 5% sampled students:
Assesses if sampled students can explain, with supporting evidence, the type of activities required, the criteria used by faculties for certification, and whether or not they have completed the certifiable competencies.
Level-1 to Level-4: Based on percentage of students demonstrating this.
Weightage Calculation (Page 146): Formula aggregates scores from these 4 sub-parameters.
Parameter 6.4: Department wise Analysis & reviewing of Students’ Performance in Formative & Internal Assessments and taking corrective actions (Weightage: 15; Qualitative)
Operational Explanation (Page 147):
Aims to capture information regarding the analysis of formative and summative assessment data.
Categorization of students into high, average, and low performers.
Focus on identifying students needing additional support/remedial measures and providing advanced/challenging tasks for exceptional performers.
Verification (Page 147):
Faculty Interaction: Same set of faculties sampled for Parameter-1.
Student Interaction: Same set of students sampled for Parameter-1, or different set.
Scoring Rubrics (Pages 147-148):
Sub-parameter 6.4.1: Post Analysis of Formative & Internal Assessment data:
Level-1: College has an ad-hoc policy for analysing post formative & internal assessment examination data.
Level-2: College has an established policy for analysing this data.
Level-3: Level-2, plus the College identifies High, Average, and Below-expectation performing students based on the analysis.
Level-4: Level-3, plus the College organizes remedial sessions for students not performing as per expectations and tracks the data-based impact of these sessions.
Supporting Documents: Documented evidence of analysis of assessment data for identifying students not performing as per expectations.
Sub-parameter 6.4.2: Remedial or additional support based on Post analysis... vis-à-vis interaction with 5% sampled students:
Assesses if sampled students can confirm whether the College organizes remedial sessions for students performing below expectations and provides any additional support to students who are performing exceptionally well.
Level-1 to Level-4: Based on percentage of students confirming this (<50%, 50-70%, 71-90%, >90%).
Supporting Documents: Professional Year-wise list of students performing below expectations, evidence of organization of remedial sessions, evidence of Advance Learning Programme or Capsules for students showing remarkable performance.
Weightage Calculation (Page 148): Formula aggregates scores from these 2 sub-parameters.
Summary of Criterion 6:
Criterion 6 provides a focused examination of a medical college's assessment ecosystem, emphasizing its alignment with CBME principles. It moves beyond simply checking if exams are held, to scrutinizing how they are planned and conducted, what tools are used, how learning is tracked (Logbooks), and critically, how assessment data is used to provide feedback and support student learning (formative assessment, remedial actions). The entirely qualitative nature of its parameters (though often verified by looking at percentages of faculty/students demonstrating a practice) signifies a deep dive into the processes and policies that underpin a robust assessment system. Effective implementation of this criterion ensures that assessment is not merely an endpoint, but an integral and ongoing part of the teaching-learning cycle, crucial for developing competent medical professionals.
Criterion 7: Research Output & Impact (Total Allocated Weightage: 100 points)
This criterion, with a significant weightage of 100 points, evaluates the medical institution's contribution to the advancement of medical knowledge through research. It assesses not just the quantity of research output but also its quality, impact, and the institution's ability to secure funding and translate research into tangible outcomes like patents and clinical trials. All parameters under this criterion are quantitative, focusing on measurable outputs and achievements.
Overall Operational Explanation (as per Page 9 and Page 149):
Core Focus: This criterion is directly related to the assessment and rating parameter: "The research output of the medical institution that has contributed to the existing knowledge and the research impact created by the medical institution," as mentioned in the Assessment and Rating Regulations-2023.
Scope of Assessment: It encompasses various facets of research:
Publications: Number of research papers, citations, and impact factors of journals.
Funded Projects: Extramural research projects (both industry/non-government and government-funded).
Intellectual Property: Patents filed and granted, and their commercialization.
Clinical Trials: Initiation and progression of clinical trials.
Emphasis on Quantity and Quality: While deriving parameters, the framework has taken into account that parameters should relate to both the quantity and quality of research activities within the Medical College/Institution.
Faculty Focus (Important Note - Page 149): Under this criterion, all provided data must be related to Faculties recruited for the MBBS programme only. This is a crucial qualifier, meaning research output from faculty exclusively involved in PG programs or other non-MBBS courses might not be considered, or may be treated differently.
Normalization: For parameters like paper publications (7.1), citation scores (7.2), and impact factors (7.3), a normalization formula is applied to the raw data to ensure fair comparison. This involves comparing the college's per-faculty average score against the minimum and maximum scores achieved by any college.
Detailed Breakdown of Parameters under Criterion 7:
Normalization Formula for Parameters 7.1, 7.2, & 7.3 (Page 149):
Normalized score = ((x′ − x) / (y − x)) × 100
x′ = the concerned college's per-faculty average score (Faculty: Professors, Associate Professors & Assistant Professors recruited for the MBBS Programme).
y = maximum per-faculty average score obtained by any college on this parameter, across all colleges.
x = minimum per-faculty average score obtained by any college on this parameter, across all colleges.
Note: College-wise average score per faculty shall be calculated. If a college offers both UG & PG Programs, minimum required full-time faculties for both programs shall be considered for averaging, otherwise only UG Program faculty.
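The min-max normalization described above can be sketched as a small function (a sketch only; the function and variable names are mine, and the degenerate-spread guard is my assumption, since the draft does not say what happens when all colleges score alike):

```python
def normalize(college_avg, min_avg, max_avg):
    """Min-max normalize a college's per-faculty average score onto 0-100.

    college_avg -- x': per-faculty average score of the concerned college
    min_avg     -- x : minimum per-faculty average across all colleges
    max_avg     -- y : maximum per-faculty average across all colleges
    """
    if max_avg == min_avg:  # degenerate spread; behaviour here is my assumption
        return 0.0
    return (college_avg - min_avg) / (max_avg - min_avg) * 100

# Example: a college averaging 400 when the spread across colleges is 100-900
print(normalize(400, 100, 900))  # 37.5
```

The same formula is reused for Parameters 7.1-7.3 (and, with per-student or per-unit denominators, throughout Criterion 8); only the averaging denominator changes.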
Parameter 7.1: Total number of research paper publications by Faculty Staff with Institutional Affiliation in last 2 Years in indexed Journals (Weightage: 15; Quantitative)
Operational Explanation (Page 150):
Considers total research paper publications by Faculty Staff and Students in the past 2 years.
Indexed Journals: Publications must be in journals indexed in databases such as Medline/PubMed Central, Science Citation Index, Science Citation Index Expanded, Embase, Scopus, and the Directory of Open Access Journals (DoAJ).
List of indexed journals taken from ‘Teacher Eligibility Qualifications in Medical Institutions-2022’.
Scoring for Individual Papers (Page 150):
Q1 Category Journal: 200 score per valid entry.
Q2 Category Journal: 150 score per valid entry.
Q3 & Q4 Category Journal: 100 score per valid entry.
Important Notes on Scoring Publications:
A research paper is considered once for scoring, regardless of multiple authors from the college.
If any Faculty/Teacher is not associated with the College at the time of submission, acceptance, AND publication, the entry is NOT considered.
If the First author is not associated with the concerned College, 50% of the assigned score is given. Other scores may be decided by the Assessment Team. (Applicable for 7.1, 7.2, 7.3).
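The per-paper rules above (Q-category scores, each paper counted once, and the 50% reduction when the first author is not on the college's roll) can be sketched as follows; the entry fields are my own illustrative names, not the framework's data format:

```python
# Q-category scores per valid entry, as listed above
Q_SCORES = {"Q1": 200, "Q2": 150, "Q3": 100, "Q4": 100}

def score_publications(entries):
    """Score a list of publication entries, counting each paper only once.

    Each entry is a dict with (hypothetical) keys:
      paper_id             -- unique id, so co-authored papers count once
      quartile             -- "Q1".."Q4"
      any_author_on_roll   -- some author was associated with the college at
                              submission, acceptance, or publication
                              (otherwise the entry is not considered)
      first_author_on_roll -- if False, only 50% of the score is awarded
    """
    seen, total = set(), 0
    for e in entries:
        if e["paper_id"] in seen or not e["any_author_on_roll"]:
            continue
        seen.add(e["paper_id"])
        score = Q_SCORES[e["quartile"]]
        if not e["first_author_on_roll"]:
            score *= 0.5  # first author not associated with the college
        total += score
    return total

papers = [
    {"paper_id": "P1", "quartile": "Q1", "any_author_on_roll": True,  "first_author_on_roll": True},
    {"paper_id": "P1", "quartile": "Q1", "any_author_on_roll": True,  "first_author_on_roll": True},   # duplicate co-author entry
    {"paper_id": "P2", "quartile": "Q3", "any_author_on_roll": True,  "first_author_on_roll": False},
]
print(score_publications(papers))  # 200 + 50 = 250.0
```

The resulting total would then be averaged per faculty and normalized as described at the start of Criterion 7.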
Scoring Rubrics (Page 151):
Level-1 to Level-4: Based on the normalized per faculty average score ranges (≤25, >25 to ≤50, >50 to <75, ≥75).
Supporting Documents: Submission details, uploaded soft copies of research papers published in indexed journals for the given database only.
Parameter 7.2: Cumulative Citation Scores of research papers published in indexed journals in last 2 years (Weightage: 15; Quantitative)
Operational Explanation (Page 151): Cumulative citation scores of all published research papers (as mentioned in 7.1) in the last 2 years.
Scoring for Individual Citations (Page 151):
Each citation of a published paper in Q1 & Q2 Category journal: 100 score.
Each citation of a research paper published in Q3 & Q4 Category journal: 50 score.
Normalization and Scoring Rubrics (Page 151-152):
Uses the same normalization formula and Level-1 to Level-4 score ranges as Parameter 7.1.
Supporting Documents: Submission details, uploaded soft copies of research papers, and evidence of citations (e.g., from Scopus, Web of Science).
Parameter 7.3: Cumulative Impact Factors of all publications published by the Institute in indexed Journals in the last 2 Years (Weightage: 05; Quantitative)
Operational Explanation (Page 152):
Impact Factor of a journal is a yardstick for its significance and rank.
Considers cumulative Impact Factors of journals where papers (validated under 7.1) were published.
Cumulative Impact Factors shall be multiplied by 500. Then per faculty average score calculated.
Normalization and Scoring Rubrics (Page 152-153):
Uses the same normalization formula and Level-1 to Level-4 score ranges as Parameter 7.1.
Supporting Documents: Submission details, uploaded soft copies of research papers published in indexed journals for given database, along with the Impact Factor of the journal in which the paper was published.
Parameter 7.4: No. of patents/Design Registration filed by the Institution in the last 2 years (Weightage: 10; Quantitative)
Operational Explanation (Page 153):
Deals with patents filed and design registrations by the Medical College.
A patent is a statutory right for an invention.
References regulations from Medical Regulator for assessment and rating.
Scoring (Page 153):
Each Patent with publication number filed by the College in the past 2 Calendar years: 50 scores per entry.
Each design registration with certificate only: 100 scores per entry.
Important Notes on Scoring Patents/Designs:
Mandatory that Faculty/Teachers are associated with the concerned College when claiming scores.
If Faculty/Teacher not associated at time of filing/award, entry not considered.
If First author not associated, 50% score assigned (or as decided by Assessment Team).
Normalization (Page 154): Standard formula based on per faculty average score. If college offers UG & PG, minimum required Faculties for both programs considered.
Scoring Rubrics (Page 154):
Level-1 to Level-4: Based on normalized score ranges.
Supporting Documents: Date of Filing of Patent Applications along with Publication Number, Design Granted Certificate.
Parameter 7.5: No. of patents granted, converted to products and commercialized in last 2 years (Weightage: 10; Quantitative)
Operational Explanation (Page 155): Related to patent applications filed in the past 2 years that have been granted, and further, if converted to products and commercialized.
Scoring (Page 155):
Each Patent grant (filed by College in past 2 Calendar years): 200 scores per entry.
Patent converted to product and commercialized: 300 scores per entry.
Important Notes: Same as Parameter 7.4 regarding faculty association and first author status.
Normalization and Scoring Rubrics (Page 155):
Uses the same normalization formula and Level-1 to Level-4 score ranges as Parameter 7.4.
Supporting Documents: Date of Granting of Patent along with Patent Number, evidence of product conversion and commercialization.
Parameter 7.6: No. of extramural funded projects completed/ongoing in collaboration with Industry/Non-government (National, State/International) funding agencies in last 2 Financial Years (Weightage: 10; Quantitative)
Operational Explanation (Page 156):
Number of projects completed or ongoing, funded by Industry or non-government agencies (India/abroad), or in collaboration with academic/research institutes.
Score given to each project vis-à-vis total sanctioned amount received in past 2 financial years in the account of College or Investigators (Full-time Faculty or Students).
Scoring for Individual Projects (Page 156): Based on sanctioned amount:
≤ 1 Lakh: 50 score
>1 Lakh to ≤ 5 Lakhs: 100 score
>5 Lakhs to ≤ 10 Lakhs: 150 score
>10 Lakhs to ≤ 25 Lakhs: 200 score
>25 Lakhs to ≤ 50 Lakhs: 250 score
>50 Lakhs to ≤ 1 Crore: 300 score
>1 Crore to ≤ 2 Crore: 350 score
>2 Crore: 400 score
Note: Cumulative scores averaged out with respect to required full-time Faculties for sanctioned intakes of UG & PG Programs (if PG Program is offered).
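The banded scoring and per-faculty averaging above can be sketched as a lookup; the bands are read as consecutive ranges in Lakhs, and exact boundary handling is my assumption since the draft's ranges are stated loosely:

```python
# Score per project by sanctioned amount (in INR Lakhs; 1 Crore = 100 Lakhs).
# (upper bound of band, score) pairs, checked in ascending order.
BANDS = [
    (1, 50), (5, 100), (10, 150), (25, 200),
    (50, 250), (100, 300), (200, 350),
]

def project_score(amount_lakhs):
    """Score one extramural project from its sanctioned amount."""
    for upper, score in BANDS:
        if amount_lakhs <= upper:
            return score
    return 400  # above 2 Crore

def per_faculty_average(amounts_lakhs, required_full_time_faculty):
    """Cumulative project scores averaged over the required full-time
    faculties for sanctioned UG (and PG, where offered) intakes."""
    return sum(project_score(a) for a in amounts_lakhs) / required_full_time_faculty

print(project_score(8))                     # 150 (between 5 and 10 Lakhs)
print(project_score(150))                   # 350 (between 1 and 2 Crore)
print(per_faculty_average([8, 150], 100))   # 5.0
```

Parameter 7.7 uses the same scale, differing only in the funding source (government agencies rather than industry/non-government bodies).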
Scoring Rubrics (Page 157):
Level-1 to Level-4: Based on the (presumably normalized) per faculty average score ranges.
Supporting Documents: Sanctioned Letter, Fund Release Letter, Utilization Letter, Project Completion Certificate.
Parameter 7.7: No. of extramural funded projects completed/ongoing/being funded by government agency in India and abroad like CSIR, ICMR & DST etc. in last 2 Financial Years (Weightage: 20; Quantitative)
Operational Explanation (Page 157): Similar to 7.6, but funding is from government agencies (CSIR, ICMR, DST etc.).
Scoring for Individual Projects (Page 157-158): Same scoring scale based on sanctioned amount as Parameter 7.6 (50 to 400 score).
Note: Same averaging note as 7.6.
Scoring Rubrics (Page 158):
Level-1 to Level-4: Based on (presumably normalized) per faculty average score ranges.
Supporting Documents: Same as 7.6.
Parameter 7.8: No. of clinical trials initiated/going on/approved for different phases in last 2 calendar year (Weightage: 15; Quantitative)
Operational Explanation (Page 158):
Clinical trials initiated, ongoing, or approved for different phases progressively by the regulatory body (the DCGI, through CDSCO).
Scoring for Individual Trials (Page 159):
Category-1 (Non-regulatory, CTRI registered): 150 score
Category-2 (Regulatory, CTRI registered): 300 score
Category-3 (Industry funded): 150 score
Category-4 (ICMR/designated govt. body funded): 250 score
Category-5 (Approved for 2nd phase by DCGI/CDSCO): 400 score
Category-6 (Approved for 3rd phase by DCGI/CDSCO): 600 score
Category-7 (Approved for 4th phase by DCGI/CDSCO): 800 score
Important Notes on Scoring Trials:
Only trials where full-time faculty of the concerned college is PI or Co-investigator. Faculty must be associated. If PI is associated, full score; if only Co-investigator is associated, 50% score.
An entry qualifying for multiple categories gets the highest score.
Cumulative scores averaged out with respect to required full-time Faculties for UG & PG Programs (if PG offered).
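The trial-scoring rules above (highest qualifying category wins; full score when the associated faculty is the PI, half when only a Co-investigator is associated) can be sketched as follows; the function and argument names are mine:

```python
# Category scores for clinical trials, as listed above
TRIAL_SCORES = {1: 150, 2: 300, 3: 150, 4: 250, 5: 400, 6: 600, 7: 800}

def trial_score(categories, pi_on_roll, co_i_on_roll):
    """Score one clinical-trial entry.

    categories   -- set of qualifying category numbers (1-7); an entry
                    qualifying for multiple categories takes the highest score
    pi_on_roll   -- the PI is an associated full-time faculty (full score)
    co_i_on_roll -- only a Co-investigator is associated (50% of the score)
    Entries with no associated faculty score nothing.
    """
    base = max(TRIAL_SCORES[c] for c in categories)
    if pi_on_roll:
        return base
    if co_i_on_roll:
        return base * 0.5
    return 0

# A CTRI-registered regulatory trial (Category-2) also approved for phase 3
# (Category-6) takes the higher score (600), halved when only a Co-I is on roll.
print(trial_score({2, 6}, pi_on_roll=False, co_i_on_roll=True))  # 300.0
```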
Scoring Rubrics (Page 159-160):
Level-1 to Level-4: Based on (presumably normalized) per faculty average score ranges.
Supporting Documents: Sanctioned Letter, Fund Release Letter, Utilization Letter, Project Completion Certificate, DCGI approval letter, CTRI registration certificate, etc.
Summary of Criterion 7:
Criterion 7 provides a robust, quantitative assessment of a medical college's research ecosystem and its tangible outputs. It emphasizes not only the volume of research (publications, projects, patents) but also its quality and impact (journal tiers, citations, impact factors, funding amounts, clinical trial phases, commercialization). The clear distinction for faculty associated with the MBBS program and specific scoring rules for first authorship or PI/Co-I status aim to ensure relevance and fairness. The use of normalization based on faculty strength allows for comparison across institutions of varying sizes. The comprehensive list of supporting documents required for each parameter indicates a rigorous, evidence-based verification process. A strong performance here signifies an institution actively contributing to medical science and innovation.
Criterion 8: Financial-Resource: Recurring & Non-recurring Expenditures (Total Allocated Weightage: 100 points)
This criterion, with a substantial weightage of 100 points, uses financial expenditure as a proxy to evaluate the institution's commitment to providing and maintaining an effective teaching-learning environment and adequate facilities for clinical training. It assumes that appropriate financial investment in key areas translates into better resources, infrastructure, and support for students and faculty, aligning with standards laid down by the UGMEB and MARB-NMC. All 10 parameters under this criterion are quantitative, focusing on the amounts spent in the previous financial year.
Overall Operational Explanation (as per Page 9 and Page 160):
Core Focus: To assess how effectively financial resources are utilized to support and enhance:
Teaching-Learning Processes: Procurement of learning resources (books, journals), lab consumables, maintenance of academic equipment.
Clinical Training of Students: Procurement and maintenance of equipment in clinical laboratories and operation theatres, indicating support for handling patient loads.
Faculty and Staff Support: Expenditure on salaries and professional development.
Infrastructure and Safety: Maintenance of campus facilities, including safety measures and sustainable practices like renewable energy.
Proxy for Effectiveness: The parameters are designed as proxies for how effectively teaching-learning activities are organized and how students are provided with learning experiences in laboratory-based simulated setups and actual clinical settings.
Alignment with Standards: This criterion is directly related to the alignment of activities with standards laid down by UGMEB, which is a major category of assessment and rating as per the Assessment and Rating Regulations-2023.
Normalization Approach (Page 160):
In the absence of fixed minimum value benchmarks by the Medical Regulator for many of these expenditure parameters, college-wise obtained values will be subjected to normalization.
The standard formula used is: Normalized score = ((x′ − x) / (y − x)) × 100
x′ = the concerned college's average score on the parameter (per student or per faculty, depending on the parameter or sub-parameter).
y = maximum per-faculty (or per-student) score obtained by any college on this parameter, across all colleges.
x = minimum per-faculty (or per-student) score obtained by any college on this parameter, across all colleges.
Important Notes for Normalization (Page 160-161):
Per Student Calculation: If the college offers PG Programmes in addition to MBBS, and if submitted financial data is applicable for all programmes, then for computation of average value per student, all students across all courses being offered will be taken into consideration.
Per Faculty Calculation: If submitted financial data is applicable for all faculties (Professor, Associate Professor, Assistant Professor) recruited for all programmes (UG & PG), then for computation of average value per faculty, all faculties on College’s roll will be considered.
Clinical Department Parameters: For parameters applicable to Clinical departments of the teaching hospital, the average value per unit will be computed using denominators like total laboratory-based investigations, total radiological investigations, total operative works, total OPD/IPD admissions, as per the nature of the parameter.
Cut-off Ranges for Performance Levels (Page 161): After normalization, the score (ranging 0-100) will determine the performance level for each parameter:
Level-1: Normalized score ≤ 25
Level-2: Normalized score >25 to ≤ 50
Level-3: Normalized score >50 to < 75
Level-4: Normalized score ≥ 75
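The cut-off ranges above map a 0-100 normalized score to a performance level; a minimal sketch (function name is mine; note that the ranges place exactly 75 in Level-4):

```python
def performance_level(normalized):
    """Map a 0-100 normalized score to a Criterion-8 performance level,
    following the cut-off ranges above."""
    if normalized <= 25:
        return 1
    if normalized <= 50:
        return 2
    if normalized < 75:
        return 3
    return 4  # normalized score >= 75

print([performance_level(s) for s in (10, 40, 60, 75)])  # [1, 2, 3, 4]
```

The same four-level cut-offs also appear in Criterion 7 (e.g. Parameter 7.1's rubric of ≤25, >25 to ≤50, >50 to <75, ≥75).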
Supporting Documents (General): "As per requirement of concerned parameter" – meaning specific documents will be needed for each.
Detailed Breakdown of Parameters under Criterion 8 (All expenditures are for the "previous financial year"):
Parameter 8.1: Total amount of Books & Journals and other Learning Resources purchased in previous financial year (Weightage: 10; Quantitative)
Operational Explanation (Page 162):
Data captured regarding amount spent on addition of new books & subscription of new journals etc.
Proxy for enriching knowledge & skills of students by making new literature available.
Total amount spent in INR in the past 1 year captured college-wise.
Note: Average value per sanctioned intake calculated. If PG programmes offered, PG sanctioned intake also considered for computing average value per student.
Supporting Documents (Page 162): List of Books & Journals procured/subscribed, invoices related to procurement and subscriptions (print/electronic) of Library Resources, Purchase Orders, Tax Invoices and Receipts, etc.
Parameter 8.2: Total amount spent on procurement of consumable Lab based materials in previous financial year (Weightage: 10; Quantitative)
Operational Explanation (Page 162): As per curriculum, 8 Practical Labs and Skill Lab are required. This parameter assesses spending on consumables for these. It also covers consumables for clinical labs in the attached teaching hospital, which is a proxy for patient load.
Breakdown for Calculation (Page 163): Amount spent in INR separately for:
Consumables in Laboratories set up in Medical College (8 Practical Labs & Skill Lab).
Consumables in clinical Laboratories set up in attached teaching hospital.
Specific Normalization Notes (Page 163):
For 8.2.1 (College Labs): Average value computed considering sanctioned intake for UG (and PG if offered). Normalization formula from Criterion-8 start.
For 8.2.2 (Hospital Labs): Average value computed by dividing total amount spent by total number of OPD attendance and IPD admissions in the last 1 calendar year. Normalization formula from Criterion-8 start.
Sub-parameters (Page 163-164):
8.2.1: Amount spent on consumables for Laboratories set-up in Medical College.
8.2.2: Amount spent on consumables for Clinical Laboratories set-up in attached Teaching Hospital.
Supporting Documents (Page 163): Purchase order, Invoices, receipts and stock registers for Practical Laboratories & Skill Laboratory / Clinical Laboratories.
Weightage Calculation (Page 164): Formula aggregates scores from these 2 sub-parameters.
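The two denominators used for Parameter 8.2 (sanctioned intake for college labs, OPD attendance plus IPD admissions for hospital labs) can be sketched as follows; function names and the example figures are mine, for illustration only:

```python
def avg_per_student(amount_inr, ug_intake, pg_intake=0):
    """8.2.1: College-lab consumables spend averaged over sanctioned
    intakes (PG intake included where PG programmes are offered)."""
    return amount_inr / (ug_intake + pg_intake)

def avg_per_patient_contact(amount_inr, opd_attendance, ipd_admissions):
    """8.2.2: Hospital clinical-lab consumables spend averaged over total
    OPD attendance plus IPD admissions in the last calendar year."""
    return amount_inr / (opd_attendance + ipd_admissions)

print(avg_per_student(1_500_000, 150, 50))                  # 7500.0 INR/student
print(avg_per_patient_contact(9_000_000, 400_000, 50_000))  # 20.0 INR/contact
```

Each average would then go through the standard Criterion-8 normalization before level cut-offs are applied; analogous per-unit denominators (radiological investigations for 8.3, operative works for 8.8/8.9) follow the same pattern.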
Parameter 8.3: Total amount spent on maintenance of radiological equipment in previous financial year (Weightage: 10; Quantitative)
Operational Explanation (Page 164): NMC guidelines: facilities for conventional/static/portable X-rays, fluoroscopy, contrast studies, ultra-sonography, CT. Spending on maintenance indicates usage due to patient load, essential for clinical training.
Normalization Note (Page 165): First, compute average value by dividing total amount spent on maintenance by total number of radiological investigations carried out in the last 1 calendar year. Then apply standard Criterion-8 normalization.
Supporting Documents (Page 165): Invoices and AMCs etc., Annual Budget, audited balance sheets etc.
Parameter 8.4: Total amount spent on procurement of non-consumable equipment in Clinical Laboratories in attached teaching hospital in previous financial year (Weightage: 10; Quantitative)
Operational Explanation (Page 166): Captures data for procurement of non-consumable lab equipment beyond MSR. Indicates augmentation of resources due to increased patient footfall, essential for clinical training.
Normalization Note (Page 166): First, compute average value by dividing total amount spent by total number of OPD attendance and IPD admissions in the last 1 calendar year. Then apply standard Criterion-8 normalization.
Supporting Documents (Page 166): Invoices, purchase order and receipts etc., Annual Budget, audited balance sheets etc.
Parameter 8.5: Total amount spent on consumable resources for indoor & outdoor sports in previous financial year (Weightage: 10; Quantitative)
Operational Explanation (Page 166): NMC guidelines: provision for indoor games, gymnasium, playground for outdoor games/track events. College submits expenditure data on procuring consumables for these. Pertains to satisfactory teaching-learning environment.
Normalization Note (Page 167): Average value per sanctioned intake calculated (UG, and PG if offered).
Supporting Documents (Page 167): Annual Budget and audited balance sheets etc., Purchase order, invoices and receipts.
Parameter 8.6: Amount spent on salary for Faculty Staff and residents in the previous financial year (Weightage: 20; Quantitative)
Operational Explanation (Page 167): College submits amount spent on gross salary of Faculty and residents.
Normalization Note (Page 167):
Compute average value per Unit: amount spent on cadre-wise gross salary of Faculty staff (numerator) and sanctioned intakes for UG (or UG+PG if offered) (denominator).
Computed per unit average value is subjected to normalization.
Specific Note on Data (Page 168): Salary data for full-time Faculty provided. Supporting documents like Form 16B, audited balance sheets (if required). Colleges categorized (Govt./Private) and levels determined after normalization.
Supporting Documents (Page 168): Form 16B for each Faculty for previous Financial Year, other documents as mentioned.
Parameter 8.7: Percentage of Electricity (Units) vis-à-vis total consumed electricity in the previous financial year obtained from renewable energy (solar/wind) (Weightage: 10; Quantitative)
Operational Explanation (Page 168): Consumption of electricity (units) in Medical College including teaching hospital, and percentage obtained from renewable sources.
Scoring Rubrics (Page 168-169) (Direct Percentage, Not Normalized via standard formula):
Level-1: <5% of total consumed electricity from renewable energy.
Level-2: >5% from renewable energy.
Level-3: >10% from renewable energy.
Level-4: >15% from renewable energy.
Supporting Documents (Page 168): Electricity Bills for last financial year (separately for College & Hospital), Evidences of electricity produced from renewable sources.
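Parameter 8.7 is scored directly from the renewable-energy share rather than through normalization. A sketch of the band mapping follows; the draft states open boundaries (<5%, >5%, >10%, >15%), so placing exactly 5% in Level-2 is my assumption:

```python
def renewable_level(pct_renewable):
    """Map the renewable share of total electricity consumed to a level.
    Boundary handling at exactly 5/10/15% is my assumption, as the draft
    states only <5%, >5%, >10%, and >15%."""
    if pct_renewable > 15:
        return 4
    if pct_renewable > 10:
        return 3
    if pct_renewable >= 5:
        return 2
    return 1

print([renewable_level(p) for p in (3, 8, 12, 20)])  # [1, 2, 3, 4]
```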
Parameter 8.8: Amount spent on procurement of consumable materials for clinical/ operational works in OT for meeting demands of patients in previous financial year (Weightage: 10; Quantitative)
Operational Explanation (Page 169): Amount spent on OT consumables indicates patient loads, proxy for adequate clinical material for training. Related to MSR for effective clinical training.
Normalization Note (Page 169): First, compute average value by dividing total amount spent on OT consumables by total number of operative works (Minor & Major) carried out in the last 1 year. Then apply standard Criterion-8 normalization.
Supporting Documents (Page 169): Purchase order, Invoices, receipts and stock registers etc., Annual Budget, audited balance sheets etc.
Parameter 8.9: Amount spent on maintenance of non-consumable equipment in OT in the previous financial year (Weightage: 05; Quantitative)
Operational Explanation (Page 170): Maintenance of OT non-consumable equipment due to patient footfalls. Essential proxy for availability of clinical material.
Normalization Note (Page 170): Similar to 8.8, average value based on total operative works.
Supporting Documents (Page 170): Purchase order, Invoices, receipts and AMCs etc., Annual Budget, audited balance sheets etc.
Parameter 8.10: Amount spent on strengthening of Safety Measures in Campus in the previous financial year (Weightage: 05; Quantitative)
Operational Explanation (Page 171): Amount spent by Medical College for strengthening safety measures in entire campus including teaching hospital. Related to MSR for providing safe physical environment.
Breakdown for Data Submission (Page 171-172): Total amount spent in INR separately for:
Medical College Heads: Maintenance of Fire Safety equipment, existing facilities for Quality Drinking water, CCTVs, electrical gadgets.
Hostels (Boys & Girls both) Heads: Maintenance of Fire Safety equipment, existing facilities for Quality Drinking water, CCTVs, electrical gadgets, salary for outsourced security Staff.
Attached Teaching Hospital Heads: Maintenance of Fire Safety equipment, existing facilities for Quality Drinking water, CCTVs, electrical gadgets, Lifts, existing facilities for BMW Management, procurement of wheelchairs & trolleys with railings.
Specific Normalization Notes (Page 172):
For 8.10.1 (College & Hostels): Average value per student (UG, or UG+PG if offered). Total amount spent on College & Hostels safety divided by sanctioned intakes. Normalization formula from Criterion-8 start.
For 8.10.2 (Attached Teaching Hospital): Average value computed by dividing total amount spent on hospital safety by total number of IPD admissions in past 1 calendar year. Normalization formula from Criterion-8 start.
Sub-parameters (Page 172-173):
8.10.1: Total amount spent on strengthening safety measures in Medical College and Hostels both.
8.10.2: Total amount spent on strengthening safety measures in attached teaching hospital.
Supporting Documents (Page 172-173): Invoices and AMCs, Annual Budget and audited balance sheets etc.
Weightage Calculation (Page 173): Formula aggregates scores from these 2 sub-parameters.
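The per-student and per-admission normalization described for Parameter 8.10 can be sketched as follows. This is a minimal illustration of the arithmetic only; the function names, the example figures, and the omission of the Criterion-8 min-max step are assumptions, not the framework's exact formula.

```python
# Sketch of the Parameter 8.10 normalization: safety expenditure is
# averaged per sanctioned seat (8.10.1) or per IPD admission (8.10.2)
# before the Criterion-8 normalization formula is applied.

def per_student_value(total_spent_inr: float, sanctioned_intake: int) -> float:
    """8.10.1: college & hostel safety spend, averaged per sanctioned seat
    (UG, or UG+PG if PG courses are offered)."""
    return total_spent_inr / sanctioned_intake

def per_admission_value(total_spent_inr: float, ipd_admissions: int) -> float:
    """8.10.2: hospital safety spend, averaged per IPD admission
    in the past calendar year."""
    return total_spent_inr / ipd_admissions

# Hypothetical example: 25 lakh INR on college/hostel safety, 150 seats
print(per_student_value(2_500_000, 150))
```

This averaging step is what lets colleges of different sizes be compared before the common Criterion-8 normalization formula is applied.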
Summary of Criterion 8:
Criterion 8 provides a financial lens through which the operational effectiveness and resource adequacy of a medical college are assessed. By quantifying expenditure across diverse areas—from learning resources and laboratory consumables to equipment maintenance, faculty salaries, and campus safety—the framework seeks to gauge an institution's tangible commitment to quality. The consistent use of normalization, often per student intake or per unit of clinical activity (like OPD visits or surgeries), attempts to create a level playing field for colleges of different sizes and scopes. While financial input doesn't solely define quality, this criterion operates on the premise that judicious and adequate spending is a necessary enabler of a high-standard teaching-learning and clinical environment. The detailed breakdown of expenditure heads and specific normalization methods indicates a granular approach to understanding how financial resources underpin the educational mission.
Criterion 9: Community Outreach Programs (Total Allocated Weightage: 40 points)
This criterion, with a weightage of 40 points, evaluates the medical college's engagement with the community, particularly focusing on its role in addressing local health needs and providing students with practical experience in community medicine. A key component of this is the Family Adoption Programme (FAP), which has been integrated as an essential part of the curriculum. This criterion reflects the NMC's emphasis on social accountability and the importance of training doctors who are sensitive to and skilled in managing community health issues.
Overall Operational Explanation (as per Page 9 and specific parameter explanations):
Core Focus: To assess the extent and impact of community outreach programs conducted by the Medical College.
Family Adoption Programme (FAP):
This is a central element. As per GMER-2023, FAP is an essential component of the Curriculum.
The parameters are derived from guidelines laid down by UGMEB for FAP.
NMC notice dated 31st March 2022 directed medical colleges to implement the Competency-Based Medical Education Curriculum (which includes FAP) from batch 2021-22 onwards.
Key Activities Assessed:
Adoption of families by students in villages.
Organization of diagnostic camps for screening and identification of diseases, ill-health, and malnutrition in these adopted communities.
Organization of follow-up diagnostic camps.
The impact of these interventions on the health outcomes of the adopted families.
Alignment with Regulations: Parameters are related to the assessment and rating category of "Standards laid down by UGMEB" as per Assessment and Rating Regulations-2023.
Student Interaction: For several parameters, assessors will interact with sampled students, aligning this criterion with students’ feedback as well.
Detailed Breakdown of Parameters under Criterion 9:
Parameter 9.1: No. of families adopted by students and organization of diagnostic camps in villages of adopted families for screening & identification of disease/ill-health & malnutrition (Weightage: 15; Quantitative)
Operational Explanation (Page 173):
Focuses on the number of families adopted by students under FAP.
Emphasizes that family adoption should preferably include villages not covered by PHCs adopted by Medical Colleges (to extend reach).
Medical Diagnostic Camps are to be organized in villages wherefrom students have adopted families for screening and identification of disease/ill-health & malnutrition.
GMER-2023 specifies professional year-wise competencies, teaching-learning methods, and assessment methods for FAP.
Verification Process (Page 174):
Interaction with the same set of students sampled under Parameter-1 of Criterion-1 (or a similar sampling method for different students).
Interaction with 25% of faculties sampled from the Department of Community Medicine (under Parameter-1 of Criterion-1).
Scoring Rubrics (Pages 174-175):
Sub-parameter 9.1.1: Adoption of Families by each professional year students:
Level-1: All students from batches 2022-23 onwards have adopted a minimum of 3 families in villages as per NMC guidelines.
Level-1 plus: For a minimum of 25% or more adopted families, the Medical College has submitted demographic data and Health Data or Clinical Examination data.
Level-2 plus: For a minimum of 50% of adopted families, similar data submitted.
Level-3 plus: For a minimum of 75% of adopted families, similar data submitted.
Supporting Documents: Documented evidences related to Family Survey for Demographic Data, Health Profile and Treatment History Records of adopted families, Evidences for organization of Medical Camp or Community Clinics, Mobile no. of Head of Family member for cross-verifications, Logbooks of students.
Sub-parameter 9.1.2: Organization of Medical Camp for Clinical Examination & screening of Family members of each adopted family:
Level-1: Minimum one Diagnostic Camp and minimum one annual follow-up diagnostic camp is organized.
Level-1 plus: Minimum 25% or more sampled students are able to produce documented evidences about how adopted family wise clinical examination or health data has been captured and further family members are screened with which type of diseases.
Level-2 plus: 25% to 50% of sampled students can produce such evidence.
Level-3 plus: More than 50% of sampled students can produce such evidence.
Supporting Documents: Same as mentioned above (implying records of camps, screening data etc.).
Weightage Calculation (Page 175): Formula aggregates scores from these 2 sub-parameters, multiplied by AW of 15.
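The draft only states that the weightage formula "aggregates scores from the sub-parameters" and multiplies by the allocated weightage (AW). A plausible reading — averaging the sub-parameter levels and scaling by AW — can be sketched as below; the exact aggregation rule is an assumption.

```python
def parameter_score(levels: list[int], allocated_weightage: float,
                    max_level: int = 4) -> float:
    """Aggregate sub-parameter levels (1-4) into a weighted parameter score.
    Assumes 'aggregate' means the average of sub-parameter levels,
    scaled to the parameter's allocated weightage (AW)."""
    avg = sum(levels) / len(levels)
    return (avg / max_level) * allocated_weightage

# Parameter 9.1 (AW = 15) with hypothetical levels for 9.1.1 and 9.1.2
print(parameter_score([3, 2], 15))  # -> 9.375
```

The same pattern would apply wherever the framework combines two or three sub-parameters under one allocated weightage.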
Parameter 9.2: Impact of family adoption/therapeutic intervention on health outcomes of adopted family (Weightage: 25; Qualitative)
Operational Explanation (Page 175):
This parameter aims to capture data pertaining to the impact of the family adoption programme and any therapeutic interventions on the health outcomes of the adopted family members. This moves beyond mere activity (adoption/camps) to assess effectiveness.
Verification Process (Page 176):
Interaction with the same set of students sampled under Parameter-1 of Criterion-1 (or a similar sampling method for different students).
Interaction with 25% of faculties sampled from the Department of Community Medicine (under Parameter-1 of Criterion-1).
Scoring Rubrics (Pages 176-177):
Sub-parameter 9.2.1: Therapeutic interventions or treatment or suggested remedial measures on health outcome for allocated family members:
Level-1: Medical College has submitted clinical examination data for all families adopted by students of batches 2022-23 onwards.
Level-1 plus: Minimum 25% of sampled students are able to produce documented evidences about the type of therapeutic intervention or treatments that were given to allocated family members as per clinical examination data if they were requiring (i.e., if interventions were needed and provided).
Level-2 plus: 25% to 50% of sampled students can produce such evidence.
Level-3 plus: More than 50% of sampled students can produce such evidence.
Supporting Documents: Family Survey data for demographic profiles, Clinical Examination data family wise, Medical History records of adopted Family, Mobile no. of Head of Family member for cross-verifications, Logbooks of students.
Sub-parameter 9.2.2: Impact and follow-up of suggested therapeutic interventions or treatment:
Assesses documentation of follow-up status and improvements in health outcomes.
Level-1: Medical College has submitted (data) for students of all batches from session 2022-23 onwards pertaining to follow-up status of therapeutic interventions or treatment if given to any adopted family members as per their clinical examination data.
Level-1 plus: Minimum 25% of sampled students are able to produce documented evidences about how they are following up on suggested treatments or remedial measures if given to adopted family members and further what is improvement in their health outcomes.
Level-2 plus: 25% to 50% of sampled students can produce such evidence.
Level-3 plus: More than 50% of sampled students can produce documented evidence of follow-up and health outcome improvement.
Supporting Documents: Medical History records, Records of suggested Treatment or remedial measures, Mobile no. of Head of Family member for cross-verifications, Logbooks of students, Follow up diagnostic camp or community clinics organized.
Weightage Calculation (Page 177): Formula aggregates scores from these 2 sub-parameters, multiplied by AW of 25.
Summary of Criterion 9:
Criterion 9, with its 40-point weightage, specifically evaluates a medical college's commitment to community engagement and social responsibility, primarily through the lens of the Family Adoption Programme (FAP). It assesses the systematic adoption of families by students, the organization of diagnostic and screening camps within these communities, and, crucially, the impact of these interventions on the health outcomes of the adopted families.
The assessment methodology combines:
Quantitative aspects: Such as the number of families adopted and the submission of health data (Parameter 9.1).
Qualitative aspects: Focused on the actual therapeutic interventions, follow-up mechanisms, and the documented impact on health, verified through student and faculty interactions and review of records (Parameter 9.2).
The verification process heavily relies on interactions with students (from all professional years involved in FAP) and faculty from the Department of Community Medicine, alongside thorough documentation review including family surveys, health records, camp reports, and student logbooks.
This criterion ensures that medical education is not confined within hospital walls but extends into the community, providing students with invaluable experience in primary healthcare, understanding social determinants of health, and contributing to local health improvement efforts. The emphasis on "impact" in Parameter 9.2 signifies a move beyond mere activity reporting to evaluating the effectiveness of these outreach programs.
Criterion 10: Quality Assurance System (QAS) (Total Allocated Weightage: 30 points)
This criterion, although carrying a relatively lower weightage of 30 points compared to some others, is vital as it assesses the formal mechanisms and processes a medical college has in place to ensure and continuously improve the quality of its services and operations. It looks at both external validations (accreditations) and internal systems for maintaining standards and safety.
Overall Operational Explanation (as per Page 10 and specific parameter explanations):
Core Focus: To evaluate the comprehensiveness and effectiveness of the Quality Assurance System (QAS) within the medical college and its associated hospital. This encompasses:
External Accreditations: Recognition by specialized national or international bodies for laboratories and hospitals, indicating adherence to established quality benchmarks.
Regulatory Compliance: Ensuring all necessary legal licenses for operation are current and valid.
Internal Quality & Safety Committees: The establishment and functioning of key committees responsible for specific aspects of quality and safety, such as pharmacovigilance and antimicrobial stewardship.
Standard Operating Procedures (SOPs): How the medical college ensures compliance with SOPs established by specialized accreditation bodies for the functioning of laboratories and hospitals.
Safety Measures: Implementation of safety measures in accordance with NMC guidelines.
Alignment with Regulations: Parameters are related to the assessment and rating category of "Standards laid down by UGMEB" as per Assessment and Rating Regulations-2023.
Proactive Approach: The criterion looks for a proactive approach from medical colleges towards quality assurance, not just reactive measures.
Detailed Breakdown of Parameters under Criterion 10:
Parameter 10.1: Accreditations of Clinical Laboratories by NABL or nationally recognized body (Weightage: 05; Quantitative)
Operational Explanation (Page 178):
Focuses on whether available clinical laboratories in the attached teaching hospital are accredited by the National Accreditation Board for Testing and Calibration Laboratories (NABL) or any other nationally recognized accreditation body.
Accreditation signifies that the laboratory meets established quality and competence standards for testing.
Scoring Rubrics (Page 178-179):
Level-0: If each Laboratory is accredited for less than 25% of tests being carried out.
Level-1: If each Laboratory is accredited for 25% to 50% of tests being carried out.
Level-2: If each Laboratory is accredited for more than 50% to 75% of tests being carried out.
Level-3: If each Laboratory is accredited for more than 75% of tests being carried out.
Supporting Documents: NABL certificate (or certificate from other recognized body), scope of accreditation detailing accredited tests.
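The Parameter 10.1 rubric maps the percentage of tests covered by accreditation onto levels 0–3. That mapping can be written directly from the cut-offs above; the function name is illustrative.

```python
def nabl_level(pct_tests_accredited: float) -> int:
    """Parameter 10.1 rubric: percentage of a laboratory's tests covered
    by NABL (or equivalent) accreditation -> level 0-3, per the draft's
    cut-offs (<25%, 25-50%, >50-75%, >75%)."""
    if pct_tests_accredited < 25:
        return 0  # Level-0: accredited for less than 25% of tests
    if pct_tests_accredited <= 50:
        return 1  # Level-1: 25% to 50%
    if pct_tests_accredited <= 75:
        return 2  # Level-2: more than 50% up to 75%
    return 3      # Level-3: more than 75%
```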
Parameter 10.2: NABH Accreditation of parent/attached hospital (Weightage: 10; Quantitative)
Operational Explanation (Page 179):
Captures whether the attached teaching hospital/parent hospital is accredited by the National Accreditation Board for Hospitals & Healthcare Providers (NABH) or any other recognized accreditation body.
NABH accreditation signifies adherence to comprehensive quality standards in patient care and hospital management.
Scoring Rubrics (Page 179):
Level-1: If the teaching hospital is under the accreditation process of NABH/any other national body and the certificate is awaited.
Level-2: If teaching hospital is accredited with entry-level accreditation of NABH/any other national body.
Level-2 plus: If teaching hospital has been granted full NABH/any other national body accreditation status.
Level-3 plus: If Full accreditation status of NABH/any other national accreditation is valid, and validity has not expired.
Supporting Documents: NABH certificate or accreditation certificate of any other national body, evidence of application status if awaiting.
Parameter 10.3: Legal Licenses- (Availability & Validity as per NMC guidelines) (Weightage: 05; Quantitative)
Operational Explanation (Page 180):
Requires capturing information pertaining to different Legal Licenses prescribed by NMC.
NMC has prescribed about 54 types of Licensing required for Medical Colleges. This parameter relates to standards laid down by NMC for compliance.
Scoring Rubrics (Page 180):
Level-1: If the Medical College is complying with less than 25% of Licenses prescribed by Medical Regulator.
Level-2: If complying with more than minimum 50% of Licenses. (Note: "more than minimum 50%" might imply a range, e.g., 50-74%, as Level 3 starts at 75%).
Level-3: If complying with minimum 75% of Licenses.
Level-4: If complying with 100% of Licenses.
Supporting Documents: Certificate/License copy of each License issued from the competent authority, clearly showing validity.
Parameter 10.4: Pharmacovigilance Committee (Weightage: 05; Qualitative)
Operational Explanation (Page 180):
This is a mandatory requirement for Medical Colleges to set up a Pharmacovigilance Committee.
Aims at capturing information related to the constitution and functioning of this Committee, which is crucial for monitoring and reporting adverse drug reactions (ADRs).
Scoring Rubrics (Pages 180-181):
Sub-parameter 10.4.1: Constitution of Pharmacovigilance Committee:
Level-1: If College has constituted Pharmacovigilance Committee as per regulatory required compositions.
Level-1 plus: If Pharmacovigilance Committee meets at least once in each six months.
Level-2 plus: If Pharmacovigilance Committee meets at least once in each 2-3 months.
Level-3 plus: If College is able to show documented evidences of all mentioned like MOMs and ATRs (Minutes of Meetings and Action Taken Reports).
Supporting Documents: Compositions of Committee, MOMs and ATRs for organized meetings.
Sub-parameter 10.4.2: Conducting Patient Education:
Assesses the committee's efforts in patient sensitization and awareness regarding Drug-drug interaction, Drug-food interaction, and adverse drug effects.
Level-1: If Committee has organized minimum one Patient sensitization and awareness Programme.
Level-1 plus: Programme organized in each 6 months.
Level-2 plus: Programme organized in each 2-3 months.
Level-3 plus (the draft repeats "Level-2 plus" here, likely a typo): Programme organized each month.
Supporting Documents: Records of Programme for Patient Education, Records of conducted Patient Education Programme etc.
Sub-parameter 10.4.3: Research Paper Publications & reporting ADRs:
Level-1: If Committee has published minimum one research paper on ADRs in Indexed Journals.
Level-1 plus: If Committee has reported minimum 1 ADR report to INDIAN PHARMACOPOEIA COMMISSION.
Level-2 plus: If Committee has published more than one research paper in indexed journals in past 2 years.
Level-3 plus: If Committee has reported more than 1 ADR report to INDIAN PHARMACOPOEIA COMMISSION.
Supporting Documents: Evidences of published Research Papers and Reported ADRs etc.
Weightage Calculation (Page 181): Formula aggregates scores from these 3 sub-parameters.
Parameter 10.5: Constitution and Functioning of Antimicrobial Stewardship (AMS) Committee (Weightage: 05; Qualitative)
Operational Explanation (Page 182):
With reference to an advisory issued by NMC in 2021, every Medical College shall have an Antimicrobial Stewardship Committee.
This parameter is framed for capturing information pertaining to measures taken to sensitize and make healthcare professionals aware of the judicious usages of antimicrobials.
Scoring Rubrics (Pages 182-183):
Sub-parameter 10.5.1: Constitution of AMS Committee:
(Note: The draft titles this sub-parameter "Constitution of Pharmacovigilance Committee", an apparent typo for "Constitution of AMS Committee".)
Level-1: If College has constituted AMS Committee as per regulatory required compositions.
Level-1 plus: If AMS Committee meets at least once in each six months.
Level-2 plus: If AMS meets at least once in each 2-3 months.
Level-3 plus: If College is able to show documented evidences of all mentioned like MOMs and ATRs.
Supporting Documents: Compositions of Committee, MOMs and ATRs for organized meetings.
Sub-parameter 10.5.2: Conducting workshops for Health Care Professionals vis-à-vis judicious usages of Antimicrobials & AMR:
Level-1 (<25% trained): If less than 25% of Doctors/Faculties of Clinical Departments & allied health care professionals in attached hospital are trained as per parameter.
Level-1 plus (>50% trained): If more than 50% are trained.
Level-2 plus (>75% trained): If more than 75% are trained.
Level-3 plus (Min 50% Interns trained): If minimum 50% of Interns have been trained as per this parameter.
Supporting Documents: Records of Programme for Patient Education (likely should be "Professional Education"), Records of conducted Patient (Professional) Education Programme etc.
Sub-parameter 10.5.3: Contribution of Microbiology Lab in data-based surveillance of AMR:
Level-1: Whether Microbiology Laboratory has analysed trends of antimicrobials based on Laboratory investigation data.
Level-1 plus: Whether Microbiology Laboratory has evolved any policy related to usages of antimicrobials and AMRs for doctors and allied health care professionals in college.
Level-2 plus: Whether Microbiology Laboratory in attached hospital of Medical College has published minimum one research paper in past 2 years on data-based reporting of visible trends of antimicrobials on human health outcome.
Level-3 plus: Whether contribution of Microbiology Laboratory in data-based reporting has been recognized by any concerned health department or agency of State or Central government.
Weightage Calculation (Page 183): Formula aggregates scores from these 3 sub-parameters.
Summary of Criterion 10:
Criterion 10, "Quality Assurance System," evaluates the formal structures and processes a medical college employs to uphold and enhance quality across its clinical and academic functions. It emphasizes both external validation through accreditations like NABL for labs and NABH for the hospital, and robust internal mechanisms like statutory licensing, and the effective functioning of critical oversight committees (Pharmacovigilance and Antimicrobial Stewardship).
The assessment involves:
Quantitative measures for accreditations and license compliance, where specific achievement levels (e.g., percentage of tests accredited, level of hospital accreditation, percentage of licenses complied with) determine the score.
Qualitative measures for the functioning of committees, focusing on their constitution, frequency and quality of meetings (evidenced by MOMs/ATRs), educational activities, reporting mechanisms, and contributions to safety and best practices.
While having a lower overall weightage (30 points), this criterion is fundamental because a strong QAS underpins the consistent delivery of quality in all other areas assessed by the framework. It ensures that processes are standardized, monitored, and continuously improved, contributing to patient safety, reliable diagnostic services, and responsible medication use, all of which are integral to a high-quality medical education and healthcare environment.
Criterion 11: Feedback & Perception of Stakeholders (Total Allocated Weightage: 40 points)
This final criterion, carrying a weightage of 40 points, focuses on an often-overlooked yet crucial aspect of institutional quality: the experiences and perceptions of those directly involved with or served by the medical college. It aims to capture systematic feedback from students, faculty, alumni, and patients to understand their views on the quality of education, facilities, support systems, and healthcare services. All parameters under this criterion are designated as quantitative, likely meaning that qualitative feedback will be collected and then aggregated or scored to produce quantitative metrics.
Overall Operational Explanation (as per Page 10 and specific parameter explanations):
Core Focus: To capture feedback and information about the perception of various stakeholders regarding the quality of Medical Colleges. This includes:
Student Perspective: Their direct experience with facilities, training, and their overall inspiration derived from the MBBS program.
Faculty Perspective: Their experiences implementing the curriculum, satisfaction with work conditions, financial entitlements, and benefits.
Alumni Perspective: How their past training at the institution has influenced their professional lives and their overall experience of the college's quality.
Patient Perspective: Their perception of the healthcare services received at the attached hospital.
Rationale:
It's crucial to understand facilities from the students' perspective, as they are direct users and can provide relevant information about whether resources contribute to their training as intended by the Medical Regulator.
Alumni feedback is important to assess the long-term impact of the institution's training.
Faculty feedback provides insights into the implementation of the curriculum and work environment.
Alignment with NMC Assessment Criteria: This criterion is associated with the assessment and rating criteria set forth by the NMC, specifically related to "Students' Feedback," "Academic Excellence," and "Teaching-Learning Environment" categories.
Data Collection: Feedback and satisfaction surveys will be conducted. For students, online feedback/responses will be collected.
Detailed Breakdown of Parameters under Criterion 11:
Parameter 11.1: Feedback from sampled students & Inspiration Index (Weightage: 10; Quantitative)
Operational Explanation (Page 184):
Online Feedback/Responses from students (minimum 50% of total enrolled students) of First Professional, Second Professional, Third Professional, and Interns will be collected.
College will be required to upload Professional Year wise information: "Name of students," "their Email Ids," and whether they are "living in Hostel provided by College or not."
Feedback will be based on two dimensions:
Dimension-1: Student Feedback vis-à-vis Teaching Learning Opportunities & Facilities (online, captured from First & Second Professional students).
Dimension-2: Impact of MBBS Program on Inspiration of Students (online, captured from Third Professional students & Interns).
Dimensions for Feedback (Page 184-185):
Dimension-1 (Teaching Learning Opportunities & Facilities - for 1st & 2nd Prof):
Clinical Postings & Exposure
Skill Laboratory/Simulation Lab
Support of Faculty for any difficulty in Learning
Indoor & Outdoor Sports facilities
Hygiene & Sanitation (College, Hostel & Teaching Hospital)
Quality drinking water & Canteen/Mess Facilities
Hostel Facilities
Central Library Facilities
Safety Measures in Campus
Dimension-2 (Impact of MBBS Program on Inspiration - for 3rd Prof & Interns):
Perceived Quality of Faculty
Perceived Career Support Facilities & Programs
Perceived Motivation Level of Faculty
Perceived Motivation level of Medical Students
Perceived Readiness of UG Student (skills & competencies for medical career)
Recommending the medical field as a career to relatives/acquaintances/other aspirants.
Scoring Methodology (Page 185):
Rating Scale: Each sub-parameter under both Dimension-1 and Dimension-2 is rated on a 4-point scale.
Computation of Dimension-wise Average Score:
Dimension wise average rating score per student = (Average rating score per student given to College / Maximum overall possible score on dimension) * 100
Scoring Rubrics (Page 186):
Sub-parameter 11.1.1: Student rating score on Dimension-1 (First & Second Professional students):
Level-1 to Level-4: Based on the average score per student being ≤25%, >25% to ≤50%, >50% to ≤75%, or >75% of the maximum possible score on this dimension.
Sub-parameter 11.1.2: Student & Intern rating score on Dimension-2 (Third Professional and interns):
Level-1 to Level-4: Similar scoring based on the average score for Dimension-2.
Supporting Documents: Responses from a minimum of 50% of students of each Professional year are required (the college must ensure this level of participation for data collection).
Weightage Calculation (Page 186): Formula aggregates scores from these 2 sub-parameters, multiplied by AW of 10.
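The dimension-wise scoring for Parameter 11.1 — average rating per student as a percentage of the maximum possible score, then mapped to a level — can be sketched as follows. The data layout (a list of per-student rating lists) is an illustrative assumption; the percentage formula and the level cut-offs are taken from the draft.

```python
def dimension_score_pct(ratings_per_student: list[list[int]],
                        max_per_item: int = 4) -> float:
    """Draft formula: (average rating score per student /
    maximum possible score on the dimension) * 100.
    Each inner list holds one student's 4-point ratings."""
    n_items = len(ratings_per_student[0])
    max_score = n_items * max_per_item
    avg = sum(sum(r) for r in ratings_per_student) / len(ratings_per_student)
    return (avg / max_score) * 100

def feedback_level(score_pct: float) -> int:
    """Map the percentage score to Level 1-4 (<=25, <=50, <=75, above)."""
    if score_pct <= 25:
        return 1
    if score_pct <= 50:
        return 2
    if score_pct <= 75:
        return 3
    return 4

# Hypothetical: two students rating three items each on a 4-point scale
pct = dimension_score_pct([[4, 3, 2], [3, 3, 4]])
print(round(pct, 1), feedback_level(pct))
```

The same score-to-level mapping recurs for the faculty (11.2), alumni (11.3), and patient (11.4) feedback rubrics.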
Parameter 11.2: Feedback from sampled Faculty & Loyalty Index (Weightage: 10; Quantitative)
Operational Explanation (Page 186):
Feedback from all Faculty will be collected online.
College required to upload Faculty names & their Email Ids.
Feedback Parameters (Teacher Feedback vis-à-vis Work Conditions & Environment & Professional Aspiration - Page 187): A list of 16 sub-parameters for faculty to respond to online:
Salary structures/Financial Entitlements
Promotion & Increments
Opportunity for Career Advancement
Faculty welfare and amenities (Privileges, Insurance, Health Check Ups)
Residential Facility /Accommodations for Staff
Workload
Recognition and Importance
Empathy of Management towards Faculty
Reward and Recognition for best performing tasks
Fair allocation & allotments of Responsibilities
Exposure to advanced Health Education Technology
Motivating Work Environment
Perceived reputation of the College
Academic Freedom & Flexibility
Recommending College among Medical Education aspirants for admission
Recommending College for Jobs
Scoring Methodology (Page 187):
Computation of Average Score per Faculty:
Score of the College based on Feedback from Faculty = (Average rating score per Faculty given to College / Maximum overall possible score) * 100
Scoring Rubrics (Page 187):
11.2- Rating based on Teaching Faculty Feedback:
Level-1 to Level-4: Based on the average score per teaching faculty being ≤25%, >25% to ≤50%, >50% to ≤75%, or >75% of the maximum score possible on this dimension.
Supporting Documents: Responses from a minimum of 50% of all full-time faculty are essential (implying their participation in the feedback survey).
Loyalty Index Calculation (Page 188):
Loyalty Index = ((Score on 5.5 Parameter / 4) + (Score on 11.2 Parameter / 4)) / 2 * Total weightage assigned to Parameters 5.5 & 11.2
(Note: Parameter 5.5 is "Staff attrition rate". This formula attempts to combine faculty satisfaction/feedback (11.2) with faculty retention (5.5) to create a "Loyalty Index". The "Total weightage assigned..." part seems unusual for calculating an index that is then likely used for scoring; typically, an index is a standalone value. The note "Loyalty Index Value will range on the scale of 0 to 25" is important.)
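The Loyalty Index formula combines the faculty-retention score (Parameter 5.5) with the faculty-feedback score (Parameter 11.2). A direct transcription of the draft's formula is sketched below; the default total weightage of 25 is an inference from the note that the index ranges 0 to 25, not a figure stated in the formula itself.

```python
def loyalty_index(score_5_5: float, score_11_2: float,
                  total_weightage: float = 25) -> float:
    """Draft formula: ((score on 5.5 / 4) + (score on 11.2 / 4)) / 2
    * total weightage assigned to Parameters 5.5 & 11.2.
    Both input scores are levels on the 0-4 scale; the default
    total_weightage of 25 is inferred from the stated 0-25 range."""
    return ((score_5_5 / 4) + (score_11_2 / 4)) / 2 * total_weightage

print(loyalty_index(4, 4))  # both parameters at Level-4 -> 25.0
```

Averaging the two normalized scores means strong feedback cannot fully offset high attrition, and vice versa.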
Parameter 11.3: Perception of Alumni towards quality of Institution (Weightage: 10; Quantitative)
Operational Explanation (Page 188):
Feedback from a minimum of 50 alumni representing oldest and youngest batches collected online.
College to upload Alumnus wise Names and their Email Ids for all batches.
Feedback Parameters (Page 188):
Establishing Network of Alumni for Professional Interaction
Organization of Alumni Meet
Awards & Recognition for Best Performing Alumni
Perceived Reputation of College among Medical Education aspirants
Perceived reputation of college among employers
Involvement of College at national & state level policy & decision making
Perception of society towards alumni of this College
Scoring Methodology (Page 188):
Computation of Average Score per Alumni:
Score of the College based on Feedback from Alumni = (Average rating score per Alumni given to College / Maximum overall possible score) * 100
Scoring Rubrics (Page 189):
11.3- Rating based on Alumni Feedback:
Level-1 to Level-4: Based on the average score per alumnus (the draft reads "alumni faculty", likely a copy-paste error) being ≤25%, >25% to ≤50%, >50% to ≤75%, or >75% of the maximum score possible on this dimension.
Supporting Documents: Minimum 50 Alumni are required to submit their Feedback.
Parameter 11.4: Perception of Patients towards Health Care Services (Weightage: 10; Quantitative)
Operational Explanation (Page 189):
Feedback from a minimum of 2 randomly selected OPD patients and 2 IPD patients per major clinical department (Medicine & allied, Surgery & allied, Obstetrics & Gynaecology), collected on the day of the onsite visit.
Feedback Parameters (Page 189):
Quality of Health Care Professional (Faculty of Clinical Departments, Residents)
Attitude of Hospital Staff towards patients
Measures for Hygiene and sanitation in Hospital
Quality of Hospital Facility
Scoring Methodology (Page 189):
Computation of Average Score per Patient:
Score of the College based on Feedback from Patients = (Average rating score per Patient given to College / Maximum overall possible score) * 100
Scoring Rubrics (Page 190):
11.4- Rating based on Patient Feedback (the draft numbers this "11.3", a typo):
Level-1 to Level-4: Based on the average score per patient (the draft again reads "alumni faculty", a copy-paste error) being ≤25%, >25% to ≤50%, >50% to ≤75%, or >75% of the maximum score possible on this dimension.
Supporting Documents: Minimum 10 to 15 Patients are required to submit their Feedback (this seems to be a general number, not per department as stated in operational explanation, needs clarification).
Summary of Criterion 11:
Criterion 11, "Feedback & Perception of Stakeholders," provides a 360-degree view of the medical college's performance and environment by systematically gathering and quantifying the experiences and opinions of its key constituents: students, faculty, alumni, and patients.
Key aspects include:
Structured Online Feedback: For students, faculty, and alumni, detailed online questionnaires covering various dimensions from academic opportunities and facilities to work environment, career impact, and institutional reputation.
Minimum Participation Thresholds: Requirements for minimum participation (e.g., 50% of students/faculty, 50 alumni) to ensure representative data.
Onsite Patient Feedback: Direct collection of patient perceptions during site visits.
Quantitative Scoring: Feedback across multiple sub-parameters is aggregated into an average score for each stakeholder group, which is then converted into a performance level (1-4).
Indices: Introduction of concepts like "Inspiration Index" for students and a "Loyalty Index" for faculty (combining feedback with attrition data from Criterion 5).
This criterion acknowledges that objective data on infrastructure and processes (covered in other criteria) should be complemented by subjective experiences. Positive stakeholder perception can indicate a healthy institutional culture, effective support systems, and a good reputation, all of which contribute to the overall quality and success of a medical college. The use of normalized scores and defined levels helps in standardizing these inherently subjective inputs for comparative assessment.