Student Growth
Student growth is a required component of T-TESS.
What does student growth measure?
Student growth measures how much a student progresses academically during his or her time with a particular teacher. It takes into account a student's entering skill level when measuring how much the student grew over time. Unlike a measure of proficiency on an assessment, student growth is not concerned with whether a student passes a particular test or reaches a predetermined, uniform benchmark. It considers equally students who enter behind grade level, on grade level, and beyond grade level, tailoring growth expectations to each student's context.
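As a simple illustration of this distinction, consider three hypothetical students. The scores and the 70-point passing benchmark below are invented for illustration only; they are not drawn from T-TESS or any state assessment:

```python
# Hypothetical pre- and post-test scores (illustrative only).
PASSING = 70  # assumed proficiency benchmark

students = {
    "enters behind grade level": (35, 62),   # (pre, post)
    "enters on grade level":     (70, 78),
    "enters beyond grade level": (92, 95),
}

for label, (pre, post) in students.items():
    growth = post - pre          # what a growth measure reports
    proficient = post >= PASSING # what a proficiency measure reports
    print(f"{label}: growth = {growth}, passed = {proficient}")
```

The student who enters behind grade level shows by far the largest growth even though he or she is the only one who does not pass; a proficiency-only measure would miss that progress entirely.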
By measuring growth, a teacher develops a better understanding of the academic impact of his or her instructional choices. In a formative appraisal process like T-TESS, feedback derived from student growth acts as a complementary piece to the feedback derived from the appraisal rubric. Whereas the rubric captures how the teacher's practice impacts students holistically, student growth captures how the teacher's practice impacts students academically.
How should student growth data be used?
Student growth data should be used just as observation data and goal-setting and professional development data are used in T-TESS – as feedback that will help inform teachers about what worked, what didn’t work, and what they can do to improve their practice moving forward.
Student growth is one measure in a multiple-measure appraisal system, and the inclusion of student growth data in a formative appraisal process provides for a more complete understanding of the impact of instructional and professional practices teachers deploy over the course of a school year.
What are the options for measuring student growth for teachers?
Districts have four options for measuring student growth: 1) student learning objectives (SLOs); 2) portfolios; 3) district-level pre- and post-tests; and 4) value-added measures (VAM) for teachers in state-tested subjects.
Districts are free to choose any measure for their teachers – no single measure must be used for a particular grade or subject (e.g., VAM doesn’t have to be used for teachers of tested grades and subjects). Districts can also use different measures for different grades or subjects. For example, a district could use SLOs for elementary generalists, but portfolios for secondary foreign language teachers.
Value-added Measures (VAM) based on state assessments:
When considering the use of VAM, please note that there are multiple models that could be used to calculate VAM, and, depending on the entity using the model, similar models can take different names.
Research will capture both pros and cons for any given model of VAM a district could pursue, so districts are encouraged to weigh the relative importance of certain considerations when choosing a model. Some of those considerations could be:
- The feedback the data produces. Does it signal to teachers potential growth areas based on the entering achievement levels of students (how well low-achieving students progress in the teacher's class, for example) or based on demographic data (how well male students progress in the teacher's class, for example)?
- The amount of information the model takes into consideration, such as prior testing data
- The ease of calculation or explanation
- The ability to calculate VAM for certain tests (e.g., 4th grade, EOC, science, social studies)
In addition, districts will need to make a host of procedural decisions related to processing data and producing a VAM measure, such as:
- How many performance levels will the district use to capture teachers' results?
- Will the district combine results for teachers who teach multiple tested grades and subjects? If so, how?
- How will the district handle shared responsibility for teaching students (e.g., a primary teacher working with content mastery, or co-teaching situations)?
- What is the minimum number of test takers that would be able to produce a VAM result for a teacher?
- Are there certain conditions that would cause a student to be dropped from the data, such as a number of absences or an enrollment date late in the school year?
- How will the district handle teachers who are out for an extended period of time, such as those on FMLA leave, for example?
- Will student-teacher linkages be binary or dynamic? For example, if a student is enrolled with a teacher on a given date, will the teacher be responsible for 100% of the student’s results, or will the responsibility be weighted based on percentage of time the student is enrolled in a teacher’s course?
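The binary-versus-dynamic linkage decision can be pictured with a small sketch. Everything below is a hypothetical illustration, not a prescribed T-TESS calculation: the function, the term dates, and the enrollment dates are all invented, and a dynamic linkage is modeled simply as the fraction of the instructional window the student was enrolled:

```python
from datetime import date

# Hypothetical illustration (not a T-TESS formula): a dynamic linkage
# weights a teacher's responsibility for a student by the fraction of
# the course's instructional window the student was enrolled.

def linkage_weight(enroll_start, enroll_end, term_start, term_end, binary=False):
    """Return the teacher's share of responsibility for one student."""
    term_days = (term_end - term_start).days
    # Clip the enrollment span to the term window.
    start = max(enroll_start, term_start)
    end = min(enroll_end, term_end)
    enrolled_days = max((end - start).days, 0)
    if binary:
        # Binary linkage: full responsibility if enrolled at all.
        return 1.0 if enrolled_days > 0 else 0.0
    # Dynamic linkage: responsibility proportional to time enrolled.
    return enrolled_days / term_days

term_start, term_end = date(2023, 8, 21), date(2024, 5, 24)
# A student who transfers in at mid-year:
w = linkage_weight(date(2024, 1, 8), term_end, term_start, term_end)
print(round(w, 2))  # → 0.49
```

Under a binary linkage the same mid-year arrival would count at full weight (1.0), so the choice materially changes which students drive a teacher's VAM result.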
Given the numerous considerations associated with using VAM, districts are strongly encouraged to work with external partners with expertise in producing VAM results.
When will student growth data be available?
For most measures of student growth, data should be available by the end-of-year conference between a teacher and an appraiser. For value-added measures (VAM), data will likely not be available until the following fall, depending on the processes a district chooses if it decides to use VAM as a measure of student growth.
The timing of some finalized student growth data should not be a barrier to implementing T-TESS as designed. First, single-year student growth data should not be the trigger for any substantial decision a district or campus makes about a teacher. Student growth is one of multiple measures of a teacher's practice, and decisions should take into consideration more than a single year of growth data. Second, in a formative appraisal process like T-TESS, the timing of student growth data reinforces the ongoing loop between appraisal, feedback, and development. Discussions about a teacher's practice should be ongoing and should evolve over the course of the year. Student growth data can be analyzed when available and should be taken into consideration when a teacher modifies or adjusts his or her goals and professional development plan at the beginning of a new school year.
Will teachers be fired if their students do not demonstrate growth?
The idea behind T-TESS is to provide teachers with more information and support as they develop as educators, not to create a punitive system. Personnel decisions have always been district decisions and not something TEA promotes as the driving force behind teacher appraisal. That said, districts make personnel decisions based on multiple factors, and TEA will continue to communicate to districts that single-year student growth data should not be the sole factor in employment decisions.
*For more information on student growth measures, please view the TEA Student Growth Overview website.