Central to the assessment framework is professional judgement. Assessors are responsible and accountable for judgements about trainee performance, leading to structured formative feedback to trainees. Trainees’ reflection on feedback is also a necessary component of all assessments.
The programme of assessment is described in each curriculum. It comprises an integrated framework of examinations, assessments in the workplace and judgements made about learners during their approved programme of training. Its purpose is to robustly evidence, ensure and clearly communicate the expected levels of performance at critical progression points in training, and to demonstrate the satisfactory completion of training required by the curriculum. All the assessments in the curriculum are designed to include a feedback element as well as to identify concerns in multiple ways, particularly:
- Learning agreement meetings
- Workplace-based assessments covering knowledge, clinical judgement, technical skills and professional behaviour and attitudes in conjunction with the surgical logbook of procedures to support the assessment of operative skills
- Examinations held at key stages - during the early years of training and towards the end of specialist training
- An annual review of competence progression (ARCP)
Assessment blueprint
Assessment framework
Learning Agreement
The learning agreement is a formal process of goal setting and review meetings that underpin training and is formulated through discussion. The process ensures adequate supervision during training, provides continuity between different placements and supervisors, and is one of the main ways of providing feedback to trainees. There are three learning agreement meetings in each placement between the trainee and Assigned Educational Supervisor (AES), and these are recorded in the trainee’s learning portfolio.
Multiple Consultant Report
Surgical training is outcomes based, ensuring trainees will be able to finish training when they are judged to have reached the standard for certification (the level expected of a day-one consultant in their specialty). The assessment of the Capabilities in Practice (CiPs) and Generic Professional Capabilities (GPCs) – the high-level outcomes of the curriculum – is through the Multiple Consultant Report (MCR). It involves a global professional judgement about a trainee’s suitability to take on particular responsibilities or tasks that are essential to consultant practice.
Multiple Consultant Report / Trainee self-assessment
Workplace-based Assessment
Effective feedback is known to enhance learning, and combining self-reflection with feedback promotes deeper learning. WBAs are primarily aimed at aiding learning through constructive feedback that identifies areas for development. They provide trainees with educational feedback from skilled clinicians that should result in reflection on practice and an improvement in the quality of care. WBAs are only mandatory for the assessment of the critical conditions and index procedures. They may also be useful to evidence progress in targeted training where this is required e.g. for any areas of concern. They are collated in the trainee’s learning portfolio and are regularly reviewed during each placement, providing evidence that informs the judgement of the AES reports for the ARCP.
Workplace-based assessment forms and guidance
Examinations
The Intercollegiate Membership examination of the Royal Colleges of Surgeons (MRCS), the MRCS(ENT) or the Diploma in Otolaryngology – Head and Neck Surgery (DO-HNS) examination is a required assessment of core surgical training before the award of an ARCP outcome 6. The examination components have been chosen to test the application of knowledge, clinical skills, interpretation of findings, clinical judgement, decision-making, professionalism, and communication skills described within the curriculum.
The Intercollegiate Specialty Board (ISB) examination is normally taken towards the end of, or after successful completion of, phase 2. The standard is set at the knowledge, clinical and professional skills expected of a day-one consultant in the generality of the specialty, and the examination must be passed in order to complete the curriculum. The examination components are chosen to test the application of knowledge, clinical skills, interpretation of findings, clinical judgement, decision-making, professionalism, and communication skills described within the curriculum.
Annual Review of Competence Progression (ARCP)
The ARCP is a formal Deanery/HEE Local Office process overseen and led by the Training Programme Director (TPD). It scrutinises the trainee’s suitability to progress through the training programme. It bases its decisions on the evidence that has been gathered in the trainee’s learning portfolio during the period between ARCP reviews, particularly the AES report from each training placement. The ARCP is normally undertaken on an annual basis for all trainees in surgical training. A panel may be convened more frequently for an interim review or to deal with progression issues (either accelerated or delayed) outside the normal schedule. The ARCP panel makes the final summative decision that determines whether trainees are making appropriate progress to be able to move to the next level or phase of training or to achieve certification.
Workplace-based assessment forms and guidance
WBAs are formative (assessments for learning) and may be used to assess and provide feedback on all clinical activity. Trainees can use any of the assessments shown below to gather feedback or provide evidence of their progression in a particular area. WBAs are only mandatory for the assessment of the critical conditions and index procedures (see the specialty-specific requirements in appendices 3 and 4 of each specialty curriculum). They may also be useful to evidence progress in targeted training where this is required e.g. for any areas of concern.
Each individual WBA is designed to assess a range of important aspects of performance in different training situations. Taken together, the WBAs can assess the breadth of knowledge, skills and performance described in the curriculum. WBAs use different trainers’ direct observations of trainees to assess their actual performance as they manage different clinical situations in different clinical settings, and provide more granular formative assessment in the crucial areas of the curriculum. Trainees undertake each task according to their training phase and ability level, and the assessor must intervene if patient safety is at risk. It would be normal for trainees to have some assessments which identify areas for development because their performance is not yet at the standard for the completion of that training.
The WBA methodology is designed to meet the following criteria:
- Validity – to ensure face validity, WBAs comprise direct observations of workplace tasks. The complexity of the tasks increases in line with progression through the training programme. To ensure content validity, all the assessment instruments have been blueprinted against the CiPs and GPCs
- Reliability – multiple measures of performance using different assessors in different training situations produce a consistent picture of performance over time
- Feasibility – methods are designed to be practical by fitting into the training and working environment
- Cost-effectiveness – the only significant additional costs should be in the training of trainers and the time investment needed for feedback and regular appraisal; this should be factored into trainer job plans
- Opportunities for feedback – structured feedback is a fundamental component of all WBAs
- Impact on learning – the WBAs are all designed to include immediate educational feedback and should lead to trainees’ reflections on practice in order to address learning needs. The assessment process thus has a continuous developmental impact on learning.
Each WBA is recorded on a structured form to help assessors distinguish between levels of performance and prompt areas for their verbal developmental feedback to trainees immediately after the observation. Each WBA includes the trainee’s and assessor’s individual comments, ratings of individual competencies (e.g. Satisfactory, Needs Development or Outstanding) and a global rating (using anchor statements mapped to phases of training). Rating scales support the drive towards excellence in practice, enabling learners to be recognised for achievements above the level expected at their level or phase of training. They may also be used to target areas of under-performance. As they accumulate, the WBAs for the critical conditions and index procedures also contribute to the AES report for the ARCP.
WBAs for index procedures and critical conditions will inform the AES report along with a range of other evidence to aid the decision about the trainee’s progress. All trainees are required to use WBAs to evidence that they have achieved the learning in the index procedures or critical conditions by certification. However, it is recognised that trainees will develop at different rates, and failure to attain a specific level at a given point will not necessarily prevent progression if other evidence shows satisfactory progress.
The assessment blueprint indicates how the assessment programme provides coverage of the CiPs, the GPC framework and the syllabus.
Mandatory WBAs
Critical Conditions
Index Procedures
Multi-source Feedback
Non-mandatory WBAs
Multiple Consultant Report / Trainee self-assessment
The MCR assessment is carried out by the consultant Clinical Supervisors (CSs) involved with a trainee, with the AES contributing as necessary to some domains (and particularly to GPC domains 6-9). The number of CSs taking part reflects the size of the specialty unit and is expected to be no fewer than two. The exercise reflects what many consultant trainers do regularly as part of a faculty group.
The MCR includes a global rating to indicate how the trainee is progressing in each of the CiPs. This global rating is expressed as a recommended supervision level:
Supervision levels
Level I: Able to observe only
Level II: Able and trusted to act with direct supervision:
a. Supervisor present throughout
b. Supervisor present for part
Level III: Able and trusted to act with indirect supervision
Level IV: Able and trusted to act at the level expected of a day-one consultant
Level V: Able and trusted to act at a level beyond that expected of a day-one consultant
In core surgical training (CT1/CT2) and the early years of run-through training (ST1/ST2), the highest level equates to the role of the phase 2 surgical trainee. Supervision level I is divided into passive and active observation, and level II distinguishes capabilities performed entirely under direct supervision from those where some or most of the activity is conducted under direct supervision. In specialty training, levels IV and V equate to the level required for certification and the level of practice expected of a day-one consultant in the Health Service.
Supervision levels are behaviourally anchored ordinal scales based on progression to competence and reflect a judgement that has clinical meaning for assessors. Using the scale, CSs must make an overall, holistic judgement of a trainee’s performance on each CiP.
The MCR uses the principle of highlight reporting: CSs do not need to comment on every descriptor within each CiP but use them to highlight areas that are above or below the expected level of performance. The MCR can describe areas where trainees might need to focus development or areas of particular excellence. Feedback must be given for any CiP that is not rated as level IV and in any GPC domain where development is required. Feedback must be given to the trainee in person after each MCR, and the process therefore includes a specific feedback meeting with the trainee using the highlighted descriptors within the MCR and/or free-text comments.
The MCR feeds into the learning agreement meeting. At the mid-point it allows goals to be agreed for the second half of the placement, with an opportunity to specifically address areas where development is required. Towards the end of the placement, it helps to inform the AES report which in turn feeds into the ARCP. The ARCP uses all presented evidence to make the definitive decision on progression. The final formative MCR also feeds into the learning agreement of the next placement to facilitate discussion between the trainee and the next AES.
Trainee self-assessment
Trainees should complete the self-assessment of their performance against the GPCs and CiPs in the same way as CSs complete the MCR, using the same form and describing self-identified areas for development with free text or the CiP/GPC descriptors. Reflection to gain insight into performance is an important development tool, and self-recognition of the level of supervision needed at any point in training enhances patient safety. Self-assessments are part of the evidence reviewed when meeting the AES at the mid-point and end of a placement. A wide discrepancy between the self-assessment and the MCR allows identification of over- or under-confidence and for support to be given accordingly.
Capabilities in Practice (CiPs)
Generic Professional Capabilities (GPCs)
Supervision Levels
MCR form
Guidance for Clinical Supervisors
Guidance for trainees
Trial MCR
Case Based Discussion (CBD)
The CBD assesses the performance of trainees in their management of a patient case to provide an indication of competence in areas such as clinical judgement, decision-making and application of medical knowledge in relation to patient care. The CBD process is a structured, in-depth discussion between the trainee and a consultant supervisor. The method is particularly designed to test higher order thinking and synthesis as it allows the assessor to explore deeper understanding of how trainees compile, prioritise and apply knowledge. By using clinical cases that offer a challenge to trainees, rather than routine cases, trainees are able to explain the complexities involved and the reasoning behind choices they made. It also enables the discussion of the ethical and legal framework of practice. It uses patient records as the basis for dialogue, for systematic assessment and structured feedback. As the actual record is the focus for the discussion, the assessor can also evaluate the quality of record keeping and the presentation of cases. The CBD is important for assessing the critical conditions (see appendix 3 of the relevant curriculum). Trainees are assessed against the standard for the completion of their phase of training.
CBD Form
CBD Guidance
Reflective CBD Form
Reflective CBD Guidance
Tips for using CBD
Clinical Evaluation Exercise (CEX) / CEX for Consent (CEX(C))
The CEX or CEX(C) assesses a clinical encounter with a patient to provide an indication of competence in skills essential for good clinical care such as communication, history taking, examination and clinical reasoning. These can be used at any time and in any setting when there is a trainee and patient interaction and an assessor is available. The CEX or CEX(C) is important for assessing the critical conditions (see appendix 3 of the relevant curriculum). Trainees are assessed against the standard for the completion of their phase of training.
CEX Form
CEXC Form
CEX/C Guidance
Tips for using CEX/C
Direct Observation of Procedural Skills (DOPS)
The DOPS assesses the trainee’s technical, operative and professional skills in a range of basic diagnostic and interventional procedures during routine surgical practice in wards, out-patient clinics and operating theatres. The procedures assessed are those that are common and important in routine surgical practice. Trainees are assessed against the standard for the completion of core surgical training.
Endoscopy
Trainees who develop endoscopy skills will record their experience through the Joint Advisory Group (JAG) Endoscopy Training System (JETS). This is a system common across all medical and surgical specialties. Trainees are required to keep a log of all endoscopic procedures they have undertaken, including the level of supervision required on each occasion. The JETS logbook demonstrates breadth of experience in endoscopy, and trainees will perform DOPS within this framework; these will be available for review and will feed into the ARCP process.
DOPS Form
DOPS Guidance
Tips for using DOPS
Specialty Specific DOPS forms
Multi-source Feedback (MSF)
The MSF assesses professional competence within a teamworking environment. It comprises a trainee self-assessment and assessments of the trainee’s performance from a range of colleagues covering different grades and environments (e.g. ward, theatre, out-patients) including the trainee’s AES. The competencies map to the standards of Good Medical Practice (GMP) and enable serious concerns, such as those about a trainee’s probity and health, to be highlighted in confidence to the AES, enabling appropriate action to be taken. Feedback is in the form of a peer assessment chart, enabling comparison of the self-assessment with the collated views received from the team and includes their anonymised but verbatim written comments. The AES should meet with the trainee to discuss the feedback on performance in the MSF. Trainees are assessed against the standard for the completion of their training level.
MSF trainee self-assessment form
MSF rater form
MSF guidance
Tips for using MSF
Procedure Based Assessment (PBA)
The PBA assesses advanced technical, operative and professional skills in a range of specialty procedures or parts of procedures during routine surgical practice in which trainees are usually scrubbed in theatre. The assessment covers pre-operative planning and preparation; exposure and closure; intra-operative elements specific to each procedure and post-operative management. The procedures reflect the routine or index procedures relevant to the specialty. The PBA is used particularly to assess the index procedures (see appendix 4 of the relevant curriculum). Trainees are assessed against the standard for certification.
PBA Guidance
Trainer validation worksheet
Tips for using PBA
PBA Assessment Forms (PDF Tool)
Surgical logbook
The surgical logbook is tailored to each specialty and allows the trainee’s competence as assessed by the DOPS and PBA to be placed in context. It is not a formal assessment in its own right, but trainees are required to keep a log of all operative procedures they have undertaken, including the level of supervision required on each occasion, using the key below. The logbook demonstrates breadth of experience, which can be compared with procedural competence using the DOPS and the PBA, and will be compared with the indicative numbers of index procedures defined in the curriculum (see appendix 4 of the relevant curriculum).
Observed (O)
Assisted (A)
Supervised - trainer scrubbed (S-TS)
Supervised - trainer unscrubbed (S-TU)
Performed (P)
Training more junior trainee (T)
The following workplace-based assessments may also be used to further collect evidence of achievement, particularly in the GPC domains of ‘Quality improvement’, ‘Education and training’ and ‘Leadership and team working’:
Assessment of Audit (AoA)
The AoA reviews a trainee’s competence in completing an audit or quality improvement project. It can be based on documentation or a presentation of a project. Trainees are assessed against the standard for the completion of their phase of training.
AoA Form
AoA Guidance
Observation of Teaching (OoT)
The OoT assesses the trainee’s ability to provide formal teaching. It can be based on any instance of formalised teaching by the trainee which has been observed by the assessor. Trainees are assessed against the standard for the completion of their phase of training.
OoT Form
OoT Guidance