Milestone-based, outcome-oriented training is now an important framework for residency education and program accreditation. Analysis of 18 months of real-time Orthopaedic Surgery Patient Care Milestone evaluations, collected via a web platform at a single residency program, demonstrated significant variability in the rate of assessment and in competency level among the Milestones. Across 614 evaluations, there was a strong, positive linear relationship between postgraduate year and competency level: chief residents achieved an average competency level of 4.0, the graduation target, as assessed by faculty in real time. These data may inform ongoing discussions about potential revisions to the Orthopaedic Surgery Milestones and highlight one potential model for improving resident feedback.

The Accreditation Council for Graduate Medical Education (ACGME) now requires United States residency programs to submit Milestone ratings biannually, as part of a move towards competency-based medical training. Our program developed a web-based platform to collect Milestone-based evaluations in real time, in an effort to improve feedback and facilitate ACGME compliance. After 18 months of use, we assessed how frequently each Milestone was evaluated in real time, as well as the distribution of competency levels by Patient Care Milestone and postgraduate year (PGY). These results may highlight relative strengths and weaknesses of a program, or of particular Milestones.

At a single academic orthopaedic residency program with 40 residents in total, use of a web-based, trainee-driven evaluation tool (eMTRCS, the electronic Milestone Tracking and Competency System) was initiated in 2014. Residents initiate an evaluation in real time, triggering a digital Milestone-based evaluation by a particular faculty member. De-identified data from January 2014 to December 2015 was abstracted. Descriptive statistics on the distribution of evaluation submissions, type of Milestone, faculty evaluation levels, and resident PGY were calculated. As the data was ordinal with evidence of non-normality, nonparametric tests were utilised to analyse differences in distributions and to assess correlations between the planned outcome variables (see the analysis sketch at the end of this section).

A total of 614 evaluations were included in the analysis, an average of 38.4 evaluations per Patient Care Milestone. There was wide variability in the number of evaluations per Milestone, ranging from only four submissions on “Diabetic Foot” to 75 submissions on “Hip and Knee Arthritis” (Figure 1). Faculty-scored competency also varied significantly among the Milestones (Figure 2; p = 0.009 by Kruskal-Wallis rank sum test). Higher levels of competency were seen as resident PGY progressed (mean = 2.1, 2.4, 3.1, 3.7 and 4.0 for PGY1–5 respectively; p < 0.001).

Over 18 months of use and 614 real-time evaluations, a web-based system for assessing Milestone levels showed significant variability in the number of assessments and in competency level among the Orthopaedic Surgery Patient Care Milestones. There are multiple possible explanations, ranging from resident and faculty confusion about the Milestones to a lack of clinical volume in specific areas. In contrast to this inter-Milestone variability in assessments and competency levels, the strong stepwise relationship between advancing PGY and increasing competency provides evidence of validity for Milestone-based evaluations. Graduating residents in this program achieved, on average, the graduation target competency level as assessed by faculty in real time.
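
For illustration, the following is a minimal sketch of the nonparametric analysis described in the methods, written in Python with pandas and SciPy. The file name "evaluations.csv" and the column names "milestone", "pgy" and "level" are assumptions about a de-identified export from eMTRCS, and the use of Spearman's rank correlation for the PGY trend is likewise an assumption; the text above specifies only the Kruskal-Wallis rank sum test for the inter-Milestone comparison.

import pandas as pd
from scipy.stats import kruskal, spearmanr

# Hypothetical de-identified export: one row per real-time evaluation,
# with the Milestone name, the resident's PGY, and the faculty-scored level.
df = pd.read_csv("evaluations.csv")

# Descriptive statistics: number of evaluations and mean level per Milestone.
per_milestone = df.groupby("milestone")["level"].agg(["count", "mean"])
print(per_milestone.sort_values("count"))

# Kruskal-Wallis rank sum test: do faculty-scored levels differ among Milestones?
groups = [g["level"].values for _, g in df.groupby("milestone")]
h_stat, p_between = kruskal(*groups)
print(f"Kruskal-Wallis across Milestones: H = {h_stat:.2f}, p = {p_between:.3f}")

# Spearman rank correlation (assumed here): do levels rise with postgraduate year?
rho, p_trend = spearmanr(df["pgy"], df["level"])
print(f"Spearman PGY vs level: rho = {rho:.2f}, p = {p_trend:.3g}")

# Mean competency level by PGY (reported above as 2.1, 2.4, 3.1, 3.7, 4.0 for PGY1-5).
print(df.groupby("pgy")["level"].mean().round(1))

Both tests operate on ranks rather than raw values, which is why they suit the ordinal, non-normally distributed competency levels described in the methods.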