Background: Decisions about performance in programs of assessment that yield an array of assessment evidence require judgments about the quality of different pieces of assessment data to determine which combination of data points best represents a trainee's overall performance. Aim: In this article, we examine the nature of the evidence that first-year medical students selected for inclusion in a portfolio used to make promotion decisions. Methods: We reviewed portfolios to examine the number, type, and source of assessments selected by students (n=32) to document their performance in seven competencies. The quality of the assessment data selected for each competency was rated by promotion committee members (n=14). Results: Students cited multiple types and sources of available assessments. The promotion committee rated evidence quality highest for competencies in which the program provided sufficient evidence for students to cite a broad range of assessments. When assessments were not provided by the program, students cited self-generated evidence. Conclusion: We found that when student-constructed portfolios are part of an overall assessment system, students generally select evidence in proportion to the number and types of assessments available.