
Profiling postgraduate workplace-based assessment implementation in Ireland: a retrospective cohort study

Overview of attention for an article published in SpringerPlus, February 2016

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

1 X user

Citations

8 (Dimensions)

Readers on Mendeley

46
Title
Profiling postgraduate workplace-based assessment implementation in Ireland: a retrospective cohort study

Published in
SpringerPlus, February 2016

DOI
10.1186/s40064-016-1748-x

Authors
Aileen Barrett, Rose Galvin, Yvonne Steinert, Albert Scherpbier, Ann O’Shaughnessy, Gillian Walsh, Mary Horgan

Abstract

In 2010, workplace-based assessment (WBA) was formally integrated as a method of formative trainee assessment into 29 basic and higher specialist medical training (BST/HST) programmes in six postgraduate training bodies in Ireland. The aim of this study is to explore how WBA is being implemented and to examine whether WBA is being used formatively as originally intended. A retrospective cohort study was conducted and approved by the institution's Research Ethics Committee. A profile of WBA requirements was obtained from 29 training programme curricula. A data extraction tool was developed to extract anonymous data, including written feedback and timing of assessments, from Year 1 and 2 trainee ePortfolios in 2012-2013. Data were independently quality-assessed and compared to the reference standard number of assessments mandated annually where relevant. All 29 training programmes mandated the inclusion of at least one case-based discussion (max = 5; range 1-5). All except two non-clinical programmes (93 %) required at least two mini-Clinical Evaluation Exercise assessments per year, and Direct Observation of Procedural Skills assessments were mandated in 27 training programmes over the course of the programme. WBA data were extracted from 50 % of randomly selected BST ePortfolios in four programmes (n = 142) and 70 % of HST ePortfolios (n = 115) in 21 programmes registered for 2012-2013. Four programmes did not have an eligible trainee for that academic year. In total, 1142 WBAs were analysed. A total of 164 trainees (63.8 %) had completed at least one WBA. The average number of WBAs completed by HST trainees was 7.75 (SD 5.8; 95 % CI 6.5-8.9; range 1-34). BST trainees completed an average of 6.1 assessments (SD 9.3; 95 % CI 4.01-8.19; range 1-76). Feedback, of varied length and quality, was provided on 44.9 % of assessments. The majority of WBAs were completed in the second half of the year. There is significant heterogeneity with respect to the frequency and quality of feedback provided during WBAs. The completion of WBAs later in the year may limit available time for feedback, performance improvement and re-evaluation. This study sets the scene for further work to explore the value of formative assessment in postgraduate medical education.

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 46 Mendeley readers of this research output.

Geographical breakdown

Country    Count  As %
Unknown    46     100%

Demographic breakdown

Readers by professional status          Count  As %
Professor                               6      13%
Researcher                              6      13%
Student > Ph. D. Student                6      13%
Student > Master                        5      11%
Student > Bachelor                      4      9%
Other                                   12     26%
Unknown                                 7      15%

Readers by discipline                   Count  As %
Medicine and Dentistry                  23     50%
Nursing and Health Professions          5      11%
Social Sciences                         5      11%
Psychology                              4      9%
Agricultural and Biological Sciences    1      2%
Other                                   2      4%
Unknown                                 6      13%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 17 January 2018.
All research outputs: #15,368,104 of 22,862,742 outputs
Outputs from SpringerPlus: #932 of 1,849 outputs
Outputs of similar age: #176,475 of 297,889 outputs
Outputs of similar age from SpringerPlus: #74 of 153 outputs

Altmetric has tracked 22,862,742 research outputs across all sources so far. This one is in the 22nd percentile – i.e., 22% of other outputs scored the same or lower than it.
So far Altmetric has tracked 1,849 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 5.7. This one is in the 35th percentile – i.e., 35% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 297,889 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 32nd percentile – i.e., 32% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 153 others from the same source and published within six weeks on either side of this one. This one is in the 37th percentile – i.e., 37% of its contemporaries scored the same or lower than it.
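The percentile figures quoted above follow a simple rank definition: the share of comparison outputs whose Attention Score is the same as or lower than this output's. A minimal sketch of that calculation is shown below in Python; the peer scores are made up for illustration and stand in for the real comparison sets (all tracked outputs, same-source outputs, or same-age outputs) described above.

```python
def percentile_rank(score, peer_scores):
    """Percentile as defined above: % of peers scoring the same as or lower than `score`."""
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * at_or_below / len(peer_scores)

# Hypothetical peer scores for illustration only; not Altmetric data.
peers = [0, 0, 1, 1, 2, 3, 5, 8, 13, 21]
print(f"A score of 1 sits at the {percentile_rank(1, peers):.0f}th percentile of this toy set.")
# -> A score of 1 sits at the 40th percentile of this toy set.
```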