District 303 administrators presented a package of changes to the district’s Student Achievement Report on Monday, Sept. 22, asking the Learning & Teaching Committee for feedback on cohort tracking, deeper disaggregation and a new “Opportunity Scale” that maps ACT score ranges to postsecondary and career pathways. The administration said it plans to return a fuller Student Achievement Report (SAR) to the committee in November for board review.
The proposal centers on three reporting shifts: tracking cohorts over time rather than relying solely on year‑to‑year cross‑sectional comparisons; comparing district cohorts against weighted averages for benchmark and aspirational districts and against the state; and adding a color‑coded “Opportunity Scale” that connects ACT score bands with the college and career opportunities historically reached by District 303 graduates. “We are not basing our goals or performance targets on the state,” the presenter said, “but we’re planning on using some state performance levels to give us clues.”
Administrators said the state has changed unified performance levels for assessments (IAR, ACT and the Illinois science assessment) and that the changes produce a non‑linear “wave” across grade levels — meaning proficiency is not equally difficult from grade to grade. The presenter used national/state visuals to illustrate that the relative difficulty of meeting proficiency can shift between grades and between math and ELA, and said the district will use that wave as a predictive lens when interpreting its own D303 cohort data rather than treating the state benchmarks as district targets.
Why it matters: committee members and administrators said the proposed report is intended to make data more actionable for principals and teachers, not just a one‑night board presentation. Administration described a plan for a district data team that would disaggregate state assessment data for each building, feed it into the building continuous improvement process and deliver building‑level analyses for principals and teacher teams. “This becomes our primary visual display,” the presenter said, describing a single set of visuals that can be filtered by subgroup, school or cohort.
Key elements presented
- Cohort tracking and benchmarking: Staff showed sample graphs comparing a D303 cohort line to a weighted benchmark district average, an aspirational district average and the state line. Administrators said they will monitor cohort slope changes and investigate causes (changes in proficiency definitions, curriculum implementations, program rollouts or other factors).
- Disaggregation options: The SAR would permit drilldowns by subgroup (English learners, special education, multilingual learners, length of enrollment, AP participants and other groups). Administrators noted one limitation: other districts’ multilingual data are not always available by language, so the district can compare its multilingual students’ IAR/ACT results with other districts but cannot make language‑specific ACCESS comparisons.
- Opportunity Scale: Using Clearinghouse data about where D303 graduates enroll and the ACT score bands that correlate with those enrollments, staff proposed a color‑coded scale that maps score ranges (for example an ACT ELA range) to the demonstrated skills and the postsecondary opportunities those scores historically align with. The presenter described that the scale would be used to create classroom‑level meaning (which skills to teach and when) and to provide students with individualized, actionable feedback from fall pre‑ACT testing.
- Postsecondary and career linkage: Staff described work to align ACT/ASVAB/industry skill indicators to career pathways (e.g., trade apprenticeships) so students and counselors can see concrete next steps; administrators cautioned that clearinghouse data cover colleges and universities but not yet trades or military outcomes, so some career pathway information would need to be curated.
- Proposed academic goals to discuss: (1) each student meeting expected growth on the state assessment (the proposal used the state student growth percentile, where the 50th percentile represents expected, or average, growth), (2) widening the district’s performance advantage over the state in ELA and math, (3) each student participating in at least one postsecondary academic or work‑based experience before graduation, and (4) increasing students’ sense of belonging (the district will explore combining attendance/engagement metrics with existing survey tools).
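The Opportunity Scale described above is essentially a lookup from ACT score bands to historically associated pathways. A minimal sketch follows; every cut score, color and pathway label here is invented for illustration, since the district has not published the actual bands:

```python
# Hypothetical sketch of a color-coded "Opportunity Scale" lookup.
# All band boundaries, colors, and pathway labels below are invented
# for illustration; they are not the district's actual cut scores.

OPPORTUNITY_SCALE = [
    # (low, high, color, example pathways)
    (1, 15, "red", "foundational skill building"),
    (16, 21, "yellow", "community college, apprenticeships"),
    (22, 27, "green", "most four-year colleges"),
    (28, 36, "blue", "selective colleges and universities"),
]

def band_for(act_score):
    """Return (color, pathways) for the band containing act_score, or None."""
    for low, high, color, pathways in OPPORTUNITY_SCALE:
        if low <= act_score <= high:
            return color, pathways
    return None

print(band_for(24))  # ('green', 'most four-year colleges')
```

In practice the district would derive the bands from Clearinghouse enrollment data; the table structure is the point here, not the numbers.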
Examples and clarifications offered in the meeting
- Administrators showed a concrete student‑level example: a student scored 771 on the 4th‑grade IAR ELA and 782 in 5th grade; that student’s growth was cited as the 81st percentile, meaning greater growth than 81 percent of peers with the same prior score. The presenter said that growth percentiles are reported on the state score report and can be used to set individualized targets.
- State methodology: presenters repeatedly noted they were summarizing the state’s changes rather than defending them. They said Illinois’s previous proficiency definitions were more stringent than most states and that the state’s recent work attempts to align performance levels (IAR/ACT/ISA) and map them to the ACT/college readiness metric.
- Data limitations: Presenters emphasized that some slides were illustrative or used national samples, not finalized D303 numbers, and warned committee members that several sample charts were fictitious placeholders meant to show format and functionality, not final data.
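The growth‑percentile idea in the student example above can be sketched in a few lines. This toy version simply ranks a student's gain among peers who had the same prior score; the state computes its official student growth percentiles with its own methodology, and the peer scores below are hypothetical:

```python
# Illustrative sketch only: ranks one student's score gain against
# same-prior-score peers. Not the state's actual SGP methodology,
# and the peer score data below are hypothetical.

def growth_percentile(prior, current, peers):
    """Percent of same-prior-score peers whose gain is below this student's.

    peers: list of (prior_score, current_score) tuples for the comparison group.
    """
    same_prior_gains = [c - p for p, c in peers if p == prior]
    if not same_prior_gains:
        return None  # no peers with this prior score
    gain = current - prior
    below = sum(1 for g in same_prior_gains if g < gain)
    return round(100 * below / len(same_prior_gains))

# Hypothetical peer group of students who also scored 771 in 4th grade:
peers = [(771, 775), (771, 778), (771, 780), (771, 784), (771, 782)]
print(growth_percentile(771, 782, peers))  # 60: grew more than 3 of 5 peers
```

The article's cited 81st percentile would come from the state's much larger statewide peer comparison, not a small group like this.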
Questions, concerns and next steps raised by committee members
- Feasibility of cross‑district growth comparisons: Board members asked whether student growth percentiles can be computed against aspirational districts; staff said they can compute growth percentiles only for the district’s own students (the district has obtained historical score files for them), and that comparable individual‑level growth percentiles are not available for other districts. Administration offered to investigate further and report back.
- Scope and communication: Several board members asked how the new visuals and individualized Opportunity Scale would be pushed to principals, teachers, counselors and students. Administration said a district data team would produce building‑level disaggregations and that pre‑ACT fall testing will generate individualized improvement plans for students and teachers.
- Equity and messaging risks: Board members cautioned that any scale or pathway map should not be read as limiting students’ futures. Administrators said the Opportunity Scale would be presented as information and direction, not as a ceiling on student potential, and noted the clearinghouse currently captures college outcomes but not trades/military outcomes.
Discussion vs. decisions
- Discussion only: All substantive items — cohort graphs, Opportunity Scale, potential academic goals, and disaggregation approaches — were presented for feedback and discussion; no new district policy or targets were formally adopted at the meeting.
- Direction/assignments: Administrators will refine the SAR and return to the Learning & Teaching Committee in November. Staff agreed to explore whether growth percentiles or other individualized comparisons can be used against aspirational districts and to clarify which comparison data are available. Committee co‑chairs asked board members to submit feedback to help prioritize which data views should appear in the November SAR.
- Formal action: none on academic goals or SAR content; the committee accepted the presentation and asked staff to continue development.
Ending: The committee thanked administration for the presentation and asked the public and board members to forward written feedback. The administration will bring a revised, board‑ready Student Achievement Report and recommendations for academic goals to a future Learning & Teaching meeting (target: November).