The usual practice with the SAT is to slip experimental questions into real SATs. The unvalidated new questions don't count toward students' scores, but ETS checks that they perform no worse than the established questions. I don't see much evidence that this has been done with Coleman's reforms yet.
Ready Aim Fire or Fire Ready Aim?
A fuller review of the evidentiary backbone for the redesigned SAT will be available on April 16, 2014. ...
Many institutions have devoted considerable resources to developing the skills of source analysis and evidence use in their students.
Please refer to research by these leading universities:
But these links don't go to psychometric studies, just to pages of general advice for undergraduates on how to write college papers.
Work has begun to build strong evidence for validity by testing item types, exam questions, essay prompts, and test forms over time. The College Board will review student performance metrics to ensure that exam questions and test forms are measuring the knowledge and skills they are intended to measure.
Pilot Predictive Validity Study
We will launch a pilot study of the predictive validity of the SAT in partnership with colleges and universities.
"We will launch" -- So they haven't actually launched even a "pilot predictive validity study" even though they've announced what they're going to do.
This study will allow us to gather early evidence of the validity of the SAT for predicting college performance. In particular, we will examine the relationship between high school grade point average (HSGPA) and SAT, and the incremental prediction of the SAT over HSGPA in predicting college English and math grades as well as first-year GPA. As sample sizes permit, we will also examine differences in predictive validity by student subgroups.
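The "incremental prediction of the SAT over HSGPA" in the quoted passage is a standard hierarchical-regression exercise: fit first-year GPA on HSGPA alone, then on HSGPA plus SAT, and compare the R-squared values. Here's a minimal sketch on synthetic data — every coefficient, noise level, and sample size below is invented for illustration, not a College Board figure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Synthetic data: HSGPA and SAT are correlated predictors of first-year GPA.
# All parameters here are made up for illustration.
hsgpa = rng.normal(3.0, 0.5, n)
sat = 500 + 200 * (hsgpa - 3.0) + rng.normal(0, 80, n)
fygpa = 1.0 + 0.5 * hsgpa + 0.001 * sat + rng.normal(0, 0.4, n)

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_hsgpa = r_squared(hsgpa[:, None], fygpa)
r2_both = r_squared(np.column_stack([hsgpa, sat]), fygpa)

# The SAT's incremental validity is the gain in R^2 over HSGPA alone.
print(f"R^2 (HSGPA only):  {r2_hsgpa:.3f}")
print(f"R^2 (HSGPA + SAT): {r2_both:.3f}")
print(f"Incremental R^2:   {r2_both - r2_hsgpa:.3f}")
```

The subgroup analyses the College Board mentions would just repeat this comparison within each demographic group, sample sizes permitting.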
Disparate impact much?
National Predictive Validity Study
After the introduction of the redesigned SAT, we will begin an ongoing process of documenting the relationship of performance on the assessment to outcomes of interest. Beginning with the first cohort of students to have primarily taken the redesigned SAT (the entering college class of fall 2017), we will launch a longitudinal national SAT Validity Study in partnership with colleges and universities to examine the relationship between SAT scores and college outcomes such as GPA, course grades, persistence, and completion. We will conduct extensive validity analyses by subgroup. The timeframe for this work will be:
Before summer 2018: Institutions sign up to participate.
Fall 2018: Data file is received by the College Board, including completed data-sharing agreements.
2019: Validity study is complete and distributed.
Those interested in participating should contact firstname.lastname@example.org.
Concordance tables will map scores on the old exam to scores on the new one, giving admissions offices longitudinal consistency in their behavioral models and letting them evaluate applicants who have taken different exams.
So, the College Board won't be able to tell you whether a 500 on the new SAT is better or worse than a 500 on the old SAT until after hundreds of thousands of kids take the new one for real in April 2016.
Delivery Schedule for Concordance Tables:
Redesigned SAT to current SAT: available May 2016
Derived concordance: Redesigned SAT to ACT
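For what it's worth, concordance tables of this kind are conventionally built by equipercentile linking: a score on one test is matched to the score on the other test that sits at the same percentile rank. A toy sketch with invented score distributions — the normal parameters and the `concord` helper are my assumptions for illustration, not the College Board's actual procedure:

```python
import numpy as np

# Toy equipercentile linking: map a score on one test to the score on
# another test at the same percentile rank. Both samples are invented.
rng = np.random.default_rng(1)
old_sat = np.clip(rng.normal(500, 110, 50_000), 200, 800)
new_sat = np.clip(rng.normal(530, 100, 50_000), 200, 800)

def concord(score, from_scores, to_scores):
    """Return the to_scores value at the same percentile rank as `score`."""
    pct = (from_scores <= score).mean() * 100
    return float(np.percentile(to_scores, pct))

for s in (400, 500, 600):
    print(f"old SAT {s} ~ new SAT {concord(s, old_sat, new_sat):.0f}")
```

The catch, of course, is that you need a large sample of real scores on both exams before the percentiles mean anything — which is exactly why the table can't exist before kids take the new test.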
Is Coleman just making this up as he goes along, hoping the psychometricians can eventually come up with data to support his intuitions? My guess is that Coleman's intuitions are less stupid than those of most figures in the education reform biz, but still ...