TN Growth Measures Planning Document
DEVELOPMENT TEAM: PK–Grade 3
DATE: February 11, 2011
INSTRUMENT SUGGESTION (include only one per document): STAR Math™ and STAR Math Enterprise
Each Development Team should consider and answer the following questions for EACH instrument they recommend to the TN Department of Education. Completed planning documents and evidence documentation should be emailed to heather@educationfirstconsulting.com.
**If the team is recommending a composite value-added score and/or a school-wide value-added measure, please go directly to number 19.
INTRODUCTION
Based on more than 20 years of experience using classroom assessment, Renaissance Learning™ advocates the responsible use of interim assessment for evaluation and instructional improvement. As a result, Renaissance Learning submits STAR Math and STAR Math Enterprise for Tennessee to consider as a measure of growth for students in kindergarten through grade 12.
The STAR Math and STAR Math Enterprise assessments are computer-adaptive interim assessments that measure mathematics skills and produce immediate results for use by administrators, teachers, students, and parents. STAR Math utilizes a proprietary method for measuring student growth called growth norms; growth norms allow for comparisons of student growth across an academic year to students with similar levels of achievement. In addition, the Web-based platform allows for easy roll-up of data to any level of reporting. Furthermore, the Tennessee Department of Education could use the underlying growth norms utilized by STAR Math to customize the growth expectations for Tennessee schools.
STAR Math has been reviewed by the technical review committee of the U.S. Department of Education's National Center on Response to Intervention (NCRTI) and has been given the highest marks for measuring growth and progress monitoring. STAR Math can be used three to eight times per year as a short-cycle, interim assessment, or up to weekly in intervention settings.
The STAR Math Enterprise assessment is Renaissance Learning's newest, most powerful version of the STAR Math assessment. It takes all the strong psychometric and statistical properties of STAR Math and enhances those capabilities by providing extended testing and reporting capabilities. Coming in May 2011, the interim assessment offers expanded skills-based testing that provides more information than ever before and continues to ensure that highly accurate scores are obtained in less than 15 minutes, with only 34 multiple-choice questions. It adds valuable new features such as broader objectives, enhanced reporting, learning progressions, and more robust item banks. STAR Math Enterprise also includes innovative reporting capabilities. For example, a new report specifies which students need to learn which materials. In addition, an online search engine puts teachers in contact with instructional resources such as teaching activities, prerequisite information, and sample question items. Other reports show student proficiency on the Tennessee Curriculum Standards and the Common Core State Standards.

[Reviewer note: Please provide an answer to #18. No other specific changes are needed at this time. Please refer to the General Suggestions listed in the email sent by Heather Gage on March 3.]
QUESTIONS FOR ALL DEVELOPMENT TEAMS:
1. Does the instrument provide a valid and reliable academic score that would measure student growth? Provide documentation as evidence. (e.g., Score Achievement Definition/Level(s), Scale Score definition and example, Technical Manual, Technical Research Studies, etc.)
The STAR Math score scale was specifically designed to follow students' achievement growth both across an academic year and across contiguous academic years. The score scale was constructed with measuring student progress in mind. The basis of the STAR Math score scale is the large item pool, with over 2,000 items ranging in difficulty from kindergarten to high school.
Items were calibrated using modern item response theory, with the Rasch model as the basis for calibrating item difficulty. Items were developed using a rigorous methodology, which is described in the Content and Item Development chapter of the technical manual. Evidence of the alignment of STAR items to the Tennessee standards is provided in the reference material and is comprehensive across all grades. STAR Math aligns well with the Tennessee Curriculum Standards.
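Under the Rasch model named above, the probability of a correct response depends only on the difference between a student's ability and an item's difficulty, both expressed on the same logit scale. A minimal sketch of the model (illustrative only, not Renaissance Learning's calibration code):

```python
import math

def rasch_p_correct(theta: float, b: float) -> float:
    """Rasch (one-parameter logistic) model: probability that a student
    with ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student whose ability equals the item's difficulty has a 50% chance:
print(rasch_p_correct(0.0, 0.0))             # 0.5
# An item two logits easier than the student's ability:
print(round(rasch_p_correct(1.0, -1.0), 2))  # 0.88
```

Calibration then amounts to estimating each item's difficulty b from field-test response data so that the model reproduces the observed proportions correct.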
After items were developed, they were field tested and calibrated. Rigorous statistical and psychometric criteria were used for item retention into the operational item pool; these are outlined on pages 23–24 of the technical manual. Descriptions of the sample used in the initial calibration phase are provided in the technical manual. Because STAR Math was developed for monitoring student progress and growth over time, the initial item calibration methodology also included a design to vertically scale the STAR Math score scale across grades. This was done to provide a continuous score scale across contiguous grades to track student growth in mathematics. The vertical scaling methodology is outlined in the Item and Scale Calibration chapter of the technical manual.
Items are also continually updated with a dynamic calibration process that allows for the field-testing and calibration of new items throughout the academic year. The same rigorous item retention rules are used when updating the item pool yearly. This ensures that only high-quality items make it into the operational item bank.
Because of the large range of student math achievement that can be encountered in the classroom, items were developed with the intention of having items of sufficient ease and difficulty that students of all levels of math performance could be challenged. Items range in difficulty from very easy items appropriate for kindergarten students just learning their numbers all the way to the high-school-level algebraic evaluation of polynomial functions.
Renaissance Learning undertook three essential steps to improve and enhance the capabilities of STAR Math; the result is STAR Math Enterprise. With this innovative software, educators are able to obtain vital skills information for each student, each time he or she takes an 8- to 15-minute test. Here are the three steps we undertook during the development process:
1. We developed learning progressions for math. Creating learning progressions involves placing items in a meaningful sequence that accurately represents the pathway to learning more and more difficult, though interrelated, skills. This approach presumes that each skill is part of a continuum and that each skill has prerequisite skills associated with it, as well as skills for which it, itself, is a prerequisite. Our first step in creating each learning progression was to extensively examine state and national standards and other widely accepted frameworks. We also reviewed and learned from the work of independent educational organizations such as Achieve. Further, as the Common Core State Standards entered into various stages of completion, we carefully monitored them in draft form and provided public commentary. Next, we used this information to craft tentative learning progressions. Finally, we solicited thorough reviews from recognized content-area experts and adjusted the learning progressions based on their input.
2. We calibrated each STAR test item, pinpointing item difficulty in relation to every other item in the corresponding item bank. This step involved extensive field-testing and an adjustment of item questions if they did not assess students as intended.
3. For each of the STAR assessments, we mapped the scaled scores to the learning progressions by equating the scales. As a result, one piece of information (the student's scaled score) automatically points to a coinciding, or mapped, piece of information (the ability level of the student, including information on which skills the student is proficient in and which skills the student should strive to acquire next).
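The mapping in step 3 can be pictured as a lookup along the progression: once each skill carries a calibrated difficulty on the same scale as the scaled score, a student's score splits the progression into skills likely mastered and skills to acquire next. The skill names and difficulty values below are hypothetical illustrations, not actual STAR Math calibrations:

```python
# Hypothetical (skill, difficulty-on-scale) pairs, sorted easiest to
# hardest; names and values are illustrative, not STAR Math data.
PROGRESSION = [
    ("Count with objects and numbers", 250),
    ("Add and subtract whole numbers", 450),
    ("Multiply whole numbers", 600),
    ("Solve a proportion, rate, or ratio", 800),
    ("Solve a linear equation", 950),
]

def locate_on_progression(scaled_score: int):
    """Split the progression into skills at or below the student's score
    (likely proficient) and the skills to strive for next."""
    mastered = [s for s, d in PROGRESSION if d <= scaled_score]
    upcoming = [s for s, d in PROGRESSION if d > scaled_score]
    return mastered, upcoming

mastered, upcoming = locate_on_progression(650)
print(upcoming[0])  # Solve a proportion, rate, or ratio
```

This is the sense in which a single scaled score "automatically points to" skill-level information once the two scales have been equated.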
STAR Math has been reviewed by the U.S. Department of Education's National Center on Response to Intervention (NCRTI) and has been found to have the highest of ratings both as a screening measure, which can be used from 3 to 8 times a year for short-cycle interim testing, and for monthly or weekly progress monitoring for students who are believed to be at risk and need more frequent assessments. The review by the NCRTI indicated that STAR Math obtained the highest rating in 10 out of 10 areas of evaluation (please note that at present the NCRTI website does not reflect the recent change of status on the predictive validity of the slope getting the highest mark, but it will be updated at the next ratings change for all measures in the spring of 2011). For instance, STAR Math obtained top ratings in reliability of the slope, predictive validity of the slope, sensitivity to student improvement, and end-of-year benchmarks. Additionally, STAR Math received the highest rating for disaggregated reliability and validity evidence for progress monitoring, with breakouts by student race/ethnicity, for which STAR Math was rated highly for all groups and showed similarly high psychometric qualities with all racial/ethnic demographics.
STAR Math Enterprise will have the same, if not better, properties as the STAR Math test today. The STAR Math Enterprise test will be an enhancement of the STAR Math test in that it will have more items per test for the student to respond to, thus increasing the reliability of scores. In addition, there will be a wider range of important objectives being assessed at all grade levels to align not only with state standards (see the attached document on the alignment with TN standards) but also with the increasing use of standards from the Common Core State Standards. Therefore, STAR Math Enterprise will have the same scale as the STAR Math test, but with an increase in items administered per test and a wider range of standards-based objectives in the operational item bank.
2. Is the instrument valid and reliable? Provide documentation as evidence for each. (e.g., Technical Manual, Technical Research Studies, Face validity evidence, etc.)
The STAR Math technical manual provides a chapter on the evidence of score reliability and the evidence of validity. Split-half, test-retest, and generic reliability estimates are provided for each grade and for all grades. Evidence indicates that overall generic reliability was .95, split-half was .92, and test-retest was .91, with an average of 7 days between testing occasions. Results by grade show a range of generic reliability from .89 in grades 4 and 5 to .93 in grades 10, 11, and 12; split-half ranging from .88 in grade 1 to .91 in grade 12; and test-retest ranging from .80 in grades 8, 10, and 11 to .90 in grade 12.
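As a rough illustration of how a split-half coefficient of the kind reported above is computed, the sketch below correlates two half-test scores and steps the result up with the Spearman-Brown formula. The student scores are invented for illustration, not drawn from the norming sample:

```python
def pearson(xs, ys):
    """Pearson correlation between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def split_half_reliability(odd_half, even_half):
    """Correlate the two half-tests, then apply the Spearman-Brown
    correction 2r / (1 + r) to estimate full-length reliability."""
    r_half = pearson(odd_half, even_half)
    return 2 * r_half / (1 + r_half)

# Illustrative odd-item and even-item half scores for six students:
odd  = [10, 12, 15, 9, 14, 11]
even = [11, 12, 14, 8, 15, 12]
print(round(split_half_reliability(odd, even), 2))  # 0.96
```

The Spearman-Brown step is needed because each half is only half as long as the real test, and shorter tests are less reliable.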
Evidence of validity was also found (see the technical manual) with numerous correlations to external tests, including state tests of end-of-year standards for accountability. Evidence provided in the technical manual highlights the numerous external measures for which correlational evidence has been obtained. At present there is a large amount of data supporting the linear relationship between STAR Math and other tests of mathematics at all grades, kindergarten through high school, and with state tests of accountability. With respect to concurrent validity estimates with external tests, there are 307 individual coefficients with an average correlation of .72. There are also 61 concurrent validity coefficients on end-of-year state tests of accountability for grades 3 to 8 with an average correlation of .73. Additionally, there are 119 predictive validity coefficients for STAR Math predicting end-of-year state standards tests in grades 3 to 8 with an average correlation of .68, and these predictive studies range in time span from 3, 6, 9, and 12 months prior to the state accountability test.
The NCRTI, as stated in #1 above, has also reviewed STAR Math and provided the highest ratings on issues related to score reliability and validity. Reviews for both Screening tools and Progress Monitoring tools resulted in the highest ratings for all of the reliability and validity criteria. Additionally, evidence of score reliability and validity for different racial/ethnic groups was found, indicating that STAR Math measures are appropriate for diverse student populations. As of January 2011, 38 research pieces support the effectiveness of STAR Math, and 15 of them were led by independent researchers. Additionally, the Southwest Educational Development Laboratory (SEDL) provided a favorable review of the reliability and validity of STAR Math scores.
STAR Math Enterprise was developed by content knowledge and state standards experts, in conjunction with highly qualified psychometricians. It will be available in May 2011 and is an enhancement and extension of STAR Math. The enhancement will include broader reporting capabilities, larger item pools of diverse content standards that have evolved over the past few years, and slightly longer tests, which should enhance the already high score reliability. STAR Math Enterprise will be an extension of STAR Math and, due to the increased test length and expanded coverage of objectives, is expected to have as good, or better, evidence of score reliability and validity.
Attached to this document is a copy of the present alignment of STAR to the Tennessee Curriculum Standards, which shows a strong alignment; this alignment will be enhanced over time. In addition, STAR is aligned with the Common Core State Standards.
3. Could the instrument be used for a statewide standardized administration? Provide documentation as evidence. (e.g., Administration Manual, Administration Criteria, Effective Standardized Practices for Administration, etc.)
STAR Math Enterprise, as well as STAR Math Renaissance Place (which, like STAR Enterprise, utilizes Renaissance Learning's hosting services), is ideal for use as a statewide standardized administration. That's because the computer-adaptive test is administered through a Web-based platform called Renaissance Place Real Time. Renaissance Place Real Time is deployed in a hosted-only configuration from the Renaissance Learning Enterprise-Class Data Center, which contains network servers operated and maintained by the company. Renaissance-provided Web hosting has a number of advantages over local network servers, three of which are cost, reliability, and security. In addition, the centralization of data makes it possible to aggregate data for analysis, as well as to compare students' scores to those of students nationwide. And finally, the centralization of data allows educators in leadership roles to compare data both over a geographical range and longitudinally over a number of years.
Due to the power of Renaissance Place, all of the schools within a district can share a central database, which can also contain data on other Renaissance products, such as Accelerated Reader™ and Accelerated Math™. With Renaissance Place, teachers and administrators can manage data and create reports from the student level all the way up to the district level. Teachers and administrators can customize reports to provide results for only a specific sub-group of children (e.g., students receiving free or reduced lunch) to allow for disaggregated progress-monitoring reporting.
Renaissance Place data resides on a centralized server in one single database per district. It is accessible to authorized users, password protected, and tailored to each user. A centralized data management system makes it easy to add and maintain student information because the information is imported only once and updated in one central location. It's also easy to add and maintain other Renaissance programs as they become available. The software is installed one time, and the most up-to-date version is always available to every computer on the system.
At present, Renaissance Learning hosts tens of thousands of schools nationwide, including in Alaska and Hawaii, along with numerous schools in foreign countries such as Canada and the United Kingdom, and Department of Defense schools in numerous countries. In just the 2010–2011 academic year, the hosted database contains over 5 million individual test records for STAR Math alone. This does not count the hundreds of thousands of tests taken every day and year on all Renaissance Learning products, such as STAR Reading, STAR Early Literacy, Accelerated Reader, Accelerated Math, Math Facts in a Flash, etc. In Tennessee alone, 1,250 schools are active users of at least one Renaissance Learning assessment program. In the present school year, 2010–2011, approximately 370 schools are actively using STAR Math, and over 43,000 students from over 140 schools took at least one STAR Math test on the hosted platform.
Additionally, historical data is always available and easily retrievable for built-in reports, customizable reporting, and extraction for data analysis. Renaissance Learning has the strategic and technical facilities to expand and scale up this data warehousing capability to meet any level of utilization. The STAR assessments under a hosting arrangement can very easily accommodate an entire state's worth of data.
The database warehousing capabilities, coupled with the computer-adaptive administration and scoring of student tests, allow educators and administrators to evaluate test results as soon as the students have finished. Reporting on the student and larger aggregates like the classroom, grade, school, etc., is virtually instantaneous. In addition to providing uniquely real-time results, this capability also provides critical management functionality for centralized monitoring of compliance with testing occasion rules. For instance, administrators can log on to the system and view the extent to which schools or teachers have followed the intended testing regime, or evaluate how many of a set of students have been tested at a specific time in the testing window. This capability allows for evaluating the extent to which testing is proceeding across a district or across a state. This can be invaluable when monitoring compliance with testing rules at a state level, as travel to inspect compliance across the state can be expensive and time-consuming.
Attached is a copy of the pre-test administration procedures for STAR Math. These instructions are used to prepare the students for their first assessment on STAR. The educator who will be proctoring the testing session provides these instructions to the group as a whole. Additionally, when students begin the testing session, they are provided with a series of practice items to ensure they are able to respond to the items in the computerized format and able to demonstrate a very basic level of competence in the subject matter for the test to proceed.
As STAR Math is a computer-adaptive test, all students across the state would be provided with a standard administration of the actual test, as all aspects of the test would be delivered by computer and scored by computer. There would be none of the usual variability seen with individual administration and scoring of tests that could degrade the measurement properties of the tests.
4. Could the instrument be implemented in all classrooms statewide? If yes, what resources would be required? Provide documentation as evidence. (e.g., Technology required or not, Costs per student for administration, scoring, and reporting, Time considerations for administration, collection, and reporting)
STAR Math and STAR Math Enterprise can be implemented in all classrooms statewide. The classrooms must have Internet-enabled computers, and the major Internet browser applications are supported for use. Renaissance Learning already hosts tens of thousands of schools across the country and processes millions of test records for all assessment products, so scaling up to serve an entire state would be well within the technical capabilities of the hosted software platform. Utilizing Renaissance Learning's technical capabilities would allow for an efficient method to meet the needs of an entire state because administration, scoring, reporting, and data warehousing of critical, confidential data would all be done from the hosted software application.
In addition, Renaissance Learning offers a variety of resources to help schools and districts install and effectively use their technology. These resources include program management services, professional development, the Renaissance Training Center™, and technical support. Trained experts are available to help customers, should any difficulties arise, by either talking over the phone or chatting over the Internet.
STAR Math costs $0.99 per student, per year. STAR Math Enterprise costs $2.49 per student, per year. The administration of the test takes less than 15 minutes. The software automatically and instantly scores the tests and generates reports. These features make the STAR products extremely practical for implementation in every classroom.
5. Does the instrument measure content that represents essential instructional objectives? Provide documentation as evidence. (e.g., Instrument to TN Curriculum comparison and alignment, Depth of Knowledge reporting and alignment, etc.)
STAR Math measures instructional objectives in math to determine students' general math achievement. It also provides prescriptive recommendations for skills students need to practice, including precise exercises for each student.
STAR Math Enterprise measures instructional objectives that have been aligned to the Common Core State Standards and placed on a series of learning progressions for math. These objectives fall into four broad domains: numbers and operations; algebra; geometry and measurement; and data analysis, statistics, and probability. Within each domain, skills are organized into sets of closely related skills. The resulting hierarchical structure is domain, skill set, skill. There are four math domains, 53 skill sets, and 558 grade-specific skills. The content for STAR Math Enterprise is based on analysis of professional standards, curriculum materials, test frameworks, and content-area research, including best practices for mathematics instruction.
STAR Math Enterprise is aligned to the Tennessee Curriculum Standards and the Common Core State Standards. Please see the attached documents for more information.
The chart below details the 53 skill sets measured by STAR Math Enterprise. The complete list of 558 grade-specific skills is available upon request.
Skill sets by domain (grades assessed in parentheses):

Numbers and Operations
  Count with objects and numbers (K–2)
  Identify odd and even numbers (2)
  Relate place and value to a whole number (1–7)
  Add and subtract whole numbers without regrouping (1–3)
  Add and subtract whole numbers with regrouping (1–4)
  Multiply whole numbers (2–6)
  Divide whole numbers without a remainder in the quotient (2–5)
  Divide whole numbers with a remainder in the quotient (4–6)
  Identify, compare, and order fractions (3–5)
  Add and subtract fractions with like denominators (4–5)
  Find prime factors, common factors, and common multiples (6)
  Add and subtract fractions with unlike denominators (5–6)
  Convert between an improper fraction and a mixed number (5)
  Relate a decimal to a fraction (4–6)
  Relate place and value to a decimal number (4–8)
  Add or subtract decimal numbers (3–6)
  Divide a whole number resulting in a decimal quotient (6)
  Multiply and divide with fractions (6–7)
  Multiply and divide with decimals (5–7)
  Relate a decimal number to a percent (6–7)
  Solve a proportion, rate, or ratio (5–7)
  Evaluate a numerical expression (7–11)
  Perform operations with integers (7)
  Determine a square root (7–8)
  Solve a problem involving percents (7–8)

Algebra
  Relate a rule to a pattern (3–8)
  Determine the operation given a situation (3–4)
  Graph on a coordinate plane (5–11)
  Evaluate an algebraic expression or function (6–9)
  Solve a linear equation (1–9)
  Determine a linear equation (6–9)
  Identify characteristics of a linear equation or function (8–9)
  Solve a system of linear equations (9)
  Determine a system of linear equations (9)
  Simplify an algebraic expression (8–9)
  Solve a linear inequality (8–9)
  Solve a nonlinear equation (9–10)
  Graph a 1-variable inequality (9)

Geometry and Measurement
  Relate money to symbols, words, and amounts (1–2)
  Use the vocabulary of geometry and measurement (K–8)
  Determine a missing figure in a pattern (3–4)
  Determine a measurement (1–9)
  Tell time (1–3)
  Calculate elapsed time (5)
  Solve a problem involving the perimeter of a shape (3–8)
  Solve a problem involving the area of a shape (5–10)
  Identify congruence and similarity of geometric shapes (K–10)
  Solve a problem involving the surface area or volume of a solid (5–10)
  Determine a missing measure or dimension of a shape (5–10)

Data Analysis, Statistics, and Probability
  Read or answer a question about charts, tables, or graphs (1–8)
  Use a chart, table, or graph to represent data (1–8)
  Determine a measure of central tendency (5–7)
  Use a proportion to make an estimate (8)
  Determine the probability of one or more events (7)
6. What scoring metrics are used for this instrument? Provide documentation as evidence. (e.g., Scale Score, Scale, Raw Score, Rubric Score, etc.)
STAR Math and STAR Math Enterprise provide a number of norm-referenced scores. All scores are well described in the technical manual.
The primary score is generated from the responses of the student to the items, which have all been calibrated using the Rasch model, and is transformed to the Scale Score, which spans a 0 to 1400 unit scale with unit intervals. The Scale Score is then used to access numerous norm-referenced scores. The norming study is described in the technical manual.
Norm-referenced scores provided by STAR Math are as follows: a percentile rank (PR) that compares the student's performance with that of other students of the same grade and month of the school year; a normal curve equivalent (NCE) score that is a transformation of the PR value to place it on an equal-interval scale; and a grade-equivalent (GE) score that expresses the grade and month of the school year for which the student's present performance was the median score in the distribution of students. It is important to note that the GE score does not represent the grade level at which a student is capable of doing mathematics work. Rather, it expresses the student's performance in that one instance as typical of students' performance during a particular grade and month of the academic year.
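The PR-to-NCE transformation mentioned above is conventionally NCE = 50 + 21.06z, where z is the normal deviate corresponding to the percentile; a quick sketch:

```python
from statistics import NormalDist

def pr_to_nce(pr: float) -> float:
    """Convert a percentile rank (1-99) to a normal curve equivalent:
    NCE = 50 + 21.06 * z, where z is the normal deviate of PR/100.
    The 21.06 factor makes NCE 1 and 99 coincide with PR 1 and 99."""
    z = NormalDist().inv_cdf(pr / 100)
    return 50 + 21.06 * z

print(round(pr_to_nce(50), 1))  # 50.0 (the scales agree at the median)
print(round(pr_to_nce(99), 1))  # 99.0 (and at the anchor points)
```

Unlike percentile ranks, equal NCE differences represent equal distances on the underlying normal scale, which is why NCE scores can be meaningfully averaged.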
7. Does the scoring take into consideration all cognitive levels? Provide documentation as evidence. (e.g., Bloom's Taxonomy, Learning Domains, Knowledge Dimensions, etc.)
STAR Math is a computer-adaptive test that can be taken by students in grades 1–12. It measures general math achievement so educators can determine placement levels, monitor progress, and accurately forecast outcomes on high-stakes tests.
The above also applies to STAR Math Enterprise. In addition, STAR Math Enterprise assesses 558 grade-specific skills in four math domains. Within each domain, skills are organized into sets of closely related skills. The resulting hierarchical structure is domain, skill set, skill. There are four math domains, 53 skill sets, and 558 grade-specific skills.
The question items of STAR Math Enterprise have been aligned to learning progressions for math. These learning progressions take into consideration cognitive levels from early numeracy through geometry.
8. What performance (achievement) levels have been determined for the instrument? Provide documentation as evidence. (e.g., Advanced, Proficient, Basic, Below Basic, Below Proficient, Mastery, etc.)
STAR Math assigns each student a grade placement, a scaled score, a grade-equivalent score, a percentile rank score, a percentile rank range, and a normal curve equivalent score. Achievement levels can be determined by, for example, viewing the percentile rank score or comparing the student's current grade to his or her grade-equivalent score.
STAR Math Renaissance Place (STAR Math that is hosted in Renaissance Learning's Data Center) has advanced reporting features that contain performance levels. For example, the Screening Report graphically depicts how many students fall into each of the following categories: At/Above Benchmark, On Watch, Intervention, and Urgent Intervention. Renaissance Learning provides default cut-off scores for each of these categories, but customers can adjust these cut-offs if they desire. The report also details which students fall into which categories. Another STAR Math Renaissance Place report that gives performance levels is the Annual Progress Report. This report graphically shows how a student, or a class, is performing over time. A trend line shows whether the student or class currently has a low risk, has some risk, or is at risk, as well as whether the progress of the student or class is trending upward and likely to put the student or class into a higher performance level in the near future.
STAR Math Enterprise has all of the reporting capabilities described above. Moreover, since Renaissance Learning has linked the STAR Enterprise tests with the Tennessee state test (TCAP) using an equivalent-groups equipercentile equating procedure, we are able to obtain an estimate as to which STAR scores correspond with the cut-offs for the TCAP proficiency categories for grades 3–8 in both reading and math. Those estimates allow teachers to gain a sense of what achievement level each student is in, just by having them take a STAR test and viewing the results. Moreover, administrators can gain a sense of what achievement levels groups of students are in, broken down by class or grade. These results appear in the Performance Reports.
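Equipercentile equating of the kind described links a STAR score to the TCAP score that occupies the same percentile rank in comparable (equivalent) groups. A deliberately simplified discrete sketch follows; operational equating smooths both score distributions first, and the data here are synthetic, not actual STAR or TCAP records:

```python
def percentile_of(score, ref_scores):
    """Proportion of the reference group scoring at or below `score`."""
    return sum(1 for x in ref_scores if x <= score) / len(ref_scores)

def equipercentile_equate(star_score, star_scores, tcap_scores):
    """Return the TCAP score holding the same percentile rank that the
    given STAR score holds in its own equivalent-group distribution."""
    p = percentile_of(star_score, star_scores)
    t = sorted(tcap_scores)
    # smallest TCAP score whose cumulative proportion reaches p
    idx = max(0, min(len(t) - 1, round(p * len(t)) - 1))
    return t[idx]

# Synthetic equivalent groups: STAR scores 0-99, TCAP scores 0-198 (even):
star = list(range(100))
tcap = list(range(0, 200, 2))
print(equipercentile_equate(49, star, tcap))  # 98
```

Once the STAR-to-TCAP mapping is in hand, the TCAP proficiency cut-offs can be carried back onto the STAR scale, which is what lets a STAR result suggest a TCAP achievement level.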
Additionally, the Screening Report includes a Tennessee report with performance levels that match the TCAP. Schools and districts have the option to create their own performance levels within the Screening Report. In addition, the State Standards Report indicates whether students or groups of students are on target, or not on target, to meet the proficiency thresholds of the Common Core State Standards. Another new report for Enterprise is the Flexible Grouping Tool, which groups students by their individual needs. Teachers can use this tool to pinpoint which groups of students need to work on which specific math skills. Finally, the Longitudinal Report helps administrators see how groups of students are progressing over time, including whether they are meeting district benchmarks; the same categories are employed as in the Screening Report.
9. Could an individual student growth score be calculated from the instrument's score? Provide documentation as evidence. (e.g., Scale Score, Gain Score, rubric score, etc.)
As the primary score, the scaled score for STAR Math was developed from an explicit vertical scaling methodology; the score scale was constructed with the measurement of growth in mind. The scaled score that represents student achievement is obtained from the computer-adaptive testing methodology that is the heart of STAR Math. The final score on the test is presented as a transformation of the underlying IRT model and placed on the score scale ranging from 0 to 1400. Thus, the scaled score can be used to represent student growth over time as a simple difference score or, more elaborately, as a time series representation when multiple measurements have been taken across time. At present, the STAR Math application will provide a regression-based estimate of student growth when the student has at least five completed tests, which is based on a simple ordinary least squares optimization methodology. Furthermore, STAR Math provides a unique growth norms methodology to allow for comparisons of students' growth rates to students across the nation who had a similar initial level of performance. Growth norms will be explained more fully in #10 below.
In addition, numerous reports provide visual representations of student progress over time, from graphical visualizations of
a student’s scores
over time to tabular repr
esentations in table form. Moreover, the underlying data warehouse database structure allows for easy roll

up and
aggregation of sets of students. This makes it easy to aggregate a classroom, grade, school, or district that utilizes repeat
ed measures on
st
udents over time to track the progress of the aggregate units.
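As a rough illustration of the regression-based estimate described above, the slope of an ordinary least squares fit of scaled score against test date yields a growth rate in points per day. This is a sketch of the general technique only, not Renaissance Learning’s implementation; the function name, the five-test minimum as a hard cutoff, and the sample scores are illustrative assumptions.

```python
from datetime import date

def ols_growth_rate(tests):
    """Estimate growth (scaled-score points per day) as the OLS slope
    of scaled score regressed on test date. Mirrors the rule described
    above: a regression-based estimate requires at least five tests."""
    if len(tests) < 5:
        return None  # not enough data for a regression-based estimate
    origin = min(d for d, _ in tests)
    xs = [(d - origin).days for d, _ in tests]
    ys = [s for _, s in tests]
    n = len(tests)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# Hypothetical scaled scores across one school year
scores = [(date(2010, 9, 1), 520), (date(2010, 10, 15), 534),
          (date(2010, 12, 1), 541), (date(2011, 1, 20), 556),
          (date(2011, 3, 10), 565)]
rate = ols_growth_rate(scores)  # points per day
```

Multiplying the daily rate by the number of days in an instructional window gives the kind of simple difference-score interpretation the text mentions.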
10. What measure of growth could be used based on the instrument? Provide documentation as evidence. (e.g. Growth score, TVAAS, norm population, etc.)
STAR Math utilizes a proprietary method of growth norms that allows the comparison of student growth rates to those of students of a similar grade and achievement level, based on our national database. At present, over one million student records provide the basis for our growth norms estimates. Growth norms are based on simple difference scores using the scaled scores. Thus, in addition to #9 above, Tennessee could construct its own measure of student growth by utilizing the underlying properties of the STAR Math IRT-based estimate of student achievement through the scaled score, or Tennessee could profit from utilizing a nationally normed set of growth expectations. Additionally, Tennessee could utilize the same methodology underlying the growth norms to develop state-specific growth norms and therefore customize the growth expectations for schools within Tennessee.
The growth norms are based on a student’s present grade and level of achievement, which we operationalize as the student’s present standing with respect to our nationally representative norms. At each grade there are 10 different distributions of growth, one for each decile group. This set of conditions was chosen because growth in mathematics differs not only across grades, with more growth apparent in the lower grades than the upper grades across an entire academic year, but also noticeably within each grade: students at different levels of math performance within the same grade show different rates of math growth. Therefore, the STAR Math growth norms take into account not only a student’s grade but also his or her initial level of performance. This allows students, when monitored across time on their math development and growth, to have their observed growth compared to a similar distribution of growth based on a national sample of students in the same grade with a similar initial level of mathematics achievement. A student’s growth can therefore be referenced against an appropriate normative distribution to make relative comparisons; for instance, this makes it possible to identify students who are making less growth than is normally observed among students of a similar grade and performance level.

Additionally, the growth norms can be aggregated across larger units of importance. For instance, the growth of an entire classroom, grade, school, or even district can be computed using the aggregation functions in the reporting features. Furthermore, the growth norms are presently used in the Goal-Setting functionality of the software, which allows reasonable estimates of growth to be used when setting goals for students across the academic year.
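The decile-conditioned comparison described above can be sketched as follows. The cut points, reference gains, and function names are hypothetical; the actual STAR Math growth norms are proprietary and derived from over one million student records.

```python
import bisect

def decile(score, grade_cuts):
    """Index 0-9 of the decile a pretest score falls into, given the
    grade's nine sorted decile cut points."""
    return bisect.bisect_right(grade_cuts, score)

def growth_percentile(observed_gain, reference_gains):
    """Percent of the same-grade, same-decile reference group whose
    scaled-score gain was below the student's observed gain."""
    below = sum(1 for g in reference_gains if g < observed_gain)
    return 100.0 * below / len(reference_gains)

# Hypothetical grade-3 decile cut points and a hypothetical
# distribution of gains for students who started in the 5th decile
grade3_cuts = [420, 455, 480, 500, 520, 540, 562, 590, 630]
ref_gains_decile4 = [22, 30, 35, 41, 44, 48, 52, 57, 63, 75]

d = decile(512, grade3_cuts)             # pretest 512 -> index 4
pct = growth_percentile(46, ref_gains_decile4)
```

A low percentile against the matched reference distribution would flag a student making less growth than typically observed for that grade and starting level.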
11. Could data be collected at more than one point in time? Provide documentation as evidence. (e.g. Pre-Post test design, prior year administration, multiple administrations, etc.)
The STAR Math and STAR Math Enterprise assessments allow for multiple administrations per student during the school year. This is possible both because the assessments are efficient—and thus practical in today’s busy classrooms—and because they were designed to be taken at regular intervals to assess each student’s growth. Teachers can use the STAR assessments three to eight times per year for screening and progress monitoring purposes, or as often as weekly in intervention settings. In addition to being constructed to evaluate students over time, numerous reporting features are available to visually represent student progress over time.

As the STAR assessments are computer-adaptive tests, they tend to provide highly reliable scores in a short amount of time, which facilitates multiple administrations with very little disruption of learning time. Additionally, the item selection procedure can optimally choose items at the appropriate difficulty level to allow for an individualized assessment for every student based on his or her current estimated achievement. STAR also has a large pool of high-quality items to ensure students can take tests on a weekly basis for progress monitoring without depleting the pool of items targeted at the student’s performance level. Taken together, these three features provide a strong assessment that is built to be used repeatedly across an academic year.
Therefore, multiple assessments can be given during the academic year to help provide educators with some idea of how their classrooms, schools, or any specific user-defined group are progressing during the academic year. Educators will not have to wait until the end of the year to see how they have done and what types of growth they have achieved with their students. Educators will be able to administer the tests frequently enough to obtain information about students’ growth by the mid-point of the school year at minimum. This can serve as an early warning system, allowing educators and administrators to take action to correct the present course before the end of the school year, when time has been lost and nothing can be done. A common example used in schools across the nation comes out of the Response to Intervention framework, where a series of three benchmarking assessments is used for efficient and reliable universal screening of all students; students whose performance is potentially at risk can then be progress monitored monthly or bi-weekly for a period of time to evaluate their growth. Furthermore, the proprietary growth norms can be used to provide a relative index to interpret the extent to which the student’s growth over that period is below expectation or within normal bounds, and can be used proactively to set reasonable goals for students based on the normative distribution of known gains made by students of a similar grade and achievement level.
As stated above, the National Center on Response to Intervention (NCRTI) independently reviewed STAR Math and verified that the assessment can be administered multiple times throughout the year: 3–5 times per year when used as a screening measure and as often as weekly when used as a progress monitoring tool. STAR Math earned the highest rating of “convincing evidence” in the “Alternate Forms” category, which required that assessments provide at least nine alternate forms that can be administered throughout a single school year with the same student, so that the student would not receive the same “form” more than once. In the case of STAR Math, the number of equivalent “forms” or tests that a student would see is virtually unlimited, because STAR Math is an IRT-based computer-adaptive assessment with a large item bank whose content spans grades K–12.
Here is more information about the efficiency, practicality, and psychometric feasibility of retesting with STAR:

STAR assessments are quick. A student can take a STAR test in less than 15 minutes. And, since all students can be tested at the same time, the majority of class time can be used for instruction and practice, not testing. This ensures that academic learning time is maximized and that short-cycle assessment does not result in an abridged curriculum.

STAR assessments are easy. Educators find the STAR assessments extremely easy to administer. The tests are short and all students can be tested at once. Plus, students can’t copy one another’s answers, since each student is given different questions. Educators also find STAR reports extremely easy to view, print, and understand. Administrators, teachers, parents, and students appreciate the reports tailored to their needs; they feel empowered by their increased knowledge of the student’s progress and their ability to make a change and assess its effectiveness.
Students won’t see the same question twice. The STAR assessments can be repeated often, due to the large number of items in the item banks. STAR Math will not repeat a test item that a particular student has seen within the past 75 days. Moreover, as students’ skill levels increase, the assessment will present harder—and thus different—items. In sum, STAR tests can be administered frequently—as often as weekly—for progress-monitoring purposes throughout the school year.
Multiple STAR Math Renaissance Place and STAR Math Enterprise scores allow educators to see a student’s progress toward math goals. After a student has taken one STAR test, teachers can use the Goal-Setting Wizard to set a moderate or ambitious goal for the student. Then, by viewing the Student Progress Monitoring Report, the teacher can see a graphic depiction of the student’s current ability level and the student’s projected ability level if the student progresses at the goal rate. After the student takes three more STAR tests, the software automatically draws a trend line that graphically shows the student’s rate of progress. By comparing the trend line to the goal line, educators can assess whether the student is on track to meet the goal. If not, the teacher knows that it’s time to intervene (or try a new intervention). The teacher can then use the software to determine whether the new intervention is effective in helping the student achieve his or her goal.
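The trend-line versus goal-line comparison works roughly as sketched below: fit a line through the student’s scores and ask whether its slope keeps pace with the goal rate. The goal rate, the weekly scores, and the function names are illustrative assumptions, not the software’s actual calculation.

```python
# Illustrative sketch only: compare a fitted trend line's slope
# against a teacher-set goal rate.

def trend_slope(points):
    """OLS slope of (week, score) pairs -- the trend line's rate."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    return sxy / sxx

def on_track(points, goal_rate):
    """True when the observed trend meets or exceeds the goal rate."""
    return trend_slope(points) >= goal_rate

# Hypothetical scores at weeks 0, 4, 8, and 12
weekly_scores = [(0, 540), (4, 548), (8, 555), (12, 566)]
status = on_track(weekly_scores, goal_rate=1.5)  # 1.5 points/week goal
```

If the trend falls below the goal rate, that is the signal to intervene or change interventions, as the report comparison described above is meant to prompt.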
Multiple STAR scores can also be collected and analyzed at the class level. The Annual Progress Report shows multiple STAR scores of the class as a whole. The software automatically draws a trend line that shows the rate of progress of the class and indicates whether the class has low risk, has some risk, or is at risk for acquiring the requisite math skills by the end of the school year.

Finally, one of the most exciting new reports for STAR Enterprise is the Longitudinal Report. This report compiles data from many test administrations over long periods of time, allowing administrators to judge how well students have been performing during, for example, the past three years. This report is especially useful to administrators who are working to ensure teacher effectiveness through research-based evaluative measures.
12. Is the instrument designed for secure administrations? Provide documentation as evidence. (e.g. secure design, secure delivery, etc.)
STAR Math and STAR Math Enterprise are designed for secure administrations: since each student is given different test items, students can’t copy one another’s answers. Additionally, use of the software is password-protected, and different stakeholders—students, parents, teachers, and administrators—receive different levels of access to the software.

Additionally, Renaissance Learning is committed to complying with the regulations of the Family Educational Rights and Privacy Act (FERPA). We do not disclose any student data. Any data we receive for the purpose of a research study is destroyed when the study is complete. For six years, Renaissance Learning has been storing student data on its servers and working with that data for internal research and development purposes in a manner that protects district, school, teacher, and student privacy. Currently, well over one million student quizzes and assessments are administered and stored on our secure servers every day during the school year. We currently host demographic and performance data for more than 15,000 schools, covering millions of students. We employ strict policies and procedures regarding access to, and handling of, school and student data. Consequently, in its entire 25-year history, Renaissance Learning has never had a breach of security or unauthorized use of data.
Finally, Renaissance Learning has implemented the following security measures:

Security Implementation for Client Access and Data Encryption: The URL address to the hosted Renaissance Place site uses a secure sockets layer (SSL) protocol for all communications over the Internet. SSL is a cryptographic protocol that provides secure communication across the Internet, preventing eavesdropping and tampering and also guaranteeing that data is only being communicated between two authenticated parties, the customer and Renaissance Learning. We use the same SSL communications that are widely trusted by leading financial institutions for electronic commerce and online asset management. By default, each of our client applications also connects using SSL. Encrypting the data with 128-bit encryption ensures that the Internet traffic cannot be intercepted and read using any type of modern sniffing or filtering software.
Control Measures and Processes to Safeguard Confidential Customer Information: Entry to the Renaissance Learning headquarters building, which houses the Renaissance Learning Enterprise-Class Data Center, is controlled via employee magnetic key entry. Entry into the Data Center is also controlled via magnetic key entry, for authorized Renaissance Learning hosting and networking personnel only. In addition, we follow strict protocols in logging and analyzing information about who has had access to the Data Center, both in person and electronically. Additionally, a security/risk analysis is completed at least once a year by an external auditing firm.
Method by which User Access Credentials Are Provisioned and Managed: The software contains seven defined user groups: district administrator, district staff, school administrator, school staff, teacher, parent, and student. Each person using the software is assigned to one of these user groups and a specific position within that group. For example, a school administrator may have an assigned position of principal, assistant principal, or reading coordinator. Each person is then assigned capabilities, which give users the right to perform specific tasks in the software. The settings also allow different levels of access to the software setup and data. For example, district administrators have access to reports for all levels—individual students, classes, teachers, schools, and the district—while a teacher will only have access to view reports for his or her individual classes.
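The tiered report access described above could be modeled along these lines. The group names follow the text, but the scope sets and function name are illustrative, not the software’s actual configuration.

```python
# Illustrative role-based report scopes; real capability assignments
# are configured per position in the software.
REPORT_SCOPES = {
    "district administrator": {"student", "class", "teacher",
                               "school", "district"},
    "school administrator":   {"student", "class", "teacher", "school"},
    "teacher":                {"student", "class"},  # own classes only
    "student":                set(),
}

def can_view(group, scope):
    """True when the user group may view reports at the given scope."""
    return scope in REPORT_SCOPES.get(group, set())

allowed = can_view("district administrator", "district")
denied = can_view("teacher", "school")
```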
13. How are students with disabilities provided equal access through the instrument? Provide documentation as evidence. (e.g. accommodations, modifications, etc.)
STAR Math is suitable for use with students with disabilities. Since this assessment uses computer-adaptive testing, the difficulty of the test is adjusted automatically to reflect the skill level of the student. Therefore, the assessment is likely to present items that correspond to the broad range of abilities of all students, including those with special needs. STAR Math minimizes frustration for students because the difficulty of questions is adapted based on a student’s response pattern. For example, if a student misses a question, the next question will be easier. If a student answers a question correctly, the difficulty level will increase.
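The easier-after-a-miss, harder-after-a-correct behavior can be sketched as a simple staircase rule. The actual STAR item selection is IRT-based and more sophisticated; the step size, bounds, and function name here are illustrative assumptions.

```python
# Toy staircase sketch of the adaptive behavior described above.
def next_difficulty(current, answered_correctly, step=0.5,
                    lo=-4.0, hi=4.0):
    """Move the target item difficulty up after a correct response and
    down after an incorrect one, clamped to the item bank's range."""
    nxt = current + step if answered_correctly else current - step
    return max(lo, min(hi, nxt))

d = 0.0
for correct in [True, True, False, True]:
    d = next_difficulty(d, correct)
```

The clamping keeps the target within the range the item bank actually covers, so the test stays answerable at both extremes of ability.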
Here’s more specific information about how students with disabilities can use STAR Math:

Students with Limited Vision: For students with limited vision, the introductory screens of the STAR Math assessment respond to the High Contrast accessibility feature within Windows and the Switch to Black and White accessibility feature in Mac OS 10. In addition, the assessment screens within STAR Math already provide visual contrast through a light background and black text. Furthermore, STAR Math is compatible with Mac OS 10’s Zoom In accessibility feature, which allows users to magnify nearly all STAR screens. If students are blind, however, the STAR Math assessment may not be appropriate.
Students with Hearing Impairments: STAR Math may be used with students who are deaf or hard of hearing using standard administration procedures. At the discretion of the teacher or specialist, STAR Math may be signed to the student if there is a question about the student’s ability to understand the text. If this adaptation is used, the results of the assessment should be evaluated cautiously. It is reasonable to assume that the same adaptations or accommodations permitted for standardized or state assessments may be used with STAR Math.
Students with Limited Motor Skills: STAR Math offers accommodations for students with disabilities through the accessibility options built into the computer’s operating system. For instance, students with limited motor skills can execute STAR Math’s mouse-related functions through the keyboard when Mouse Keys is selected under Accessibility Options within the Windows operating system. The Mouse Keys option is also available for the Mac under System Preferences > Universal Access > Mouse > Mouse Keys. Generally speaking, the STAR Math assessment is compatible with the adaptive devices typically used by students with limited motor skills. These devices may be part of the operating system or may be add-on devices such as cursor or keyboard controllers.

Federal IDEA regulations allow the use of Response to Intervention (RTI) processes in special education determinations (although RTI is about more than just special education determinations). In addition, the federally funded National Center on Response to Intervention (NCRTI) has given the STAR assessments some of its highest ratings for screening and progress monitoring—meaning that they are among the best assessment tools for RTI purposes, including use with students in special education programs.

STAR Math is keyboard based. STAR Math Enterprise has all of the capabilities listed above; it also supports both mouse and keyboard input.
14. How are students exposed to the format of the instrument prior to the administration? Provide documentation as evidence. (e.g. Practice assessments, Item samples, Similar instrument integrated into instructional measurement, etc.)
The practice session before the STAR Math test allows students to become comfortable with the test interface and ensures that they know how to operate the software properly. Students can pass the practice session and proceed to the actual STAR Math test by answering two of the three practice questions correctly. If a student does not do this, the program presents three more questions, and the student can pass the practice session by answering two of those three questions correctly. If the student does not pass after the second attempt, the student will not proceed to the actual STAR Math test.

Even students with low math and reading skills should be able to answer the practice questions correctly. However, STAR Math will halt the testing session and tell the student to ask the teacher for help if the student does not pass after the second attempt. Students may experience difficulty with the practice questions for a variety of reasons. The student may not understand math even at the most basic level or may be confused by the “not given” response option presented in some of the practice questions. Alternatively, the student may need help using the keyboard. If this is the case, the teacher (or monitor) should help the student through the practice session during the student’s next STAR Math test. If a student still struggles with the practice questions even with teacher assistance, he or she may not yet be ready to complete a STAR Math test. The same is true of STAR Math Enterprise.
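The pass/retry logic of the practice session described above amounts to a small gate: pass with 2 of 3 correct, retry once with 3 more questions, otherwise halt. The function and variable names below are illustrative.

```python
def practice_session_passes(attempts):
    """attempts: up to two lists of 3 booleans (correct/incorrect).
    Returns True when any attempt has at least 2 correct answers,
    mirroring the 2-of-3 rule with one retry described above."""
    for answers in attempts[:2]:   # at most two attempts are allowed
        if sum(answers) >= 2:
            return True
    return False                   # session halts; ask the teacher

ok = practice_session_passes([[True, False, True]])
retry = practice_session_passes([[False, False, True],
                                 [True, True, False]])
halted = practice_session_passes([[False, False, True],
                                  [False, True, False]])
```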
15. Are there any noted unintended consequences? Provide documentation as evidence. (e.g. Bias, Misinterpretation of Results, Restricting curriculum, fairness, etc.)
STAR assessments are more than just monitoring devices: incorporated into instructional planning, they play an active role in improving teaching and learning. Based on more than 20 years of experience using classroom assessments in these ways, we advocate the responsible use of short-cycle assessments for two purposes: (1) to provide additional data points in systems to evaluate educators (both teachers and principals) in a fairer and more timely manner, and (2) as sources of regular feedback to educators so they can fine-tune instruction, provide interventions for struggling students, and identify opportunities to improve their own professional growth.

These two uses of interim data—evaluation and instructional improvement—go together and should be mutually supportive. The process will not only identify effective educators, but also enable virtually all educators to become more effective. Because improving student learning is the ultimate goal, the use of interim data for evaluation should always be accompanied by its broader use as part of the instructional process.
In addition, Renaissance Learning follows strict item-writing specifications, including bias and fairness criteria that address stereotypes and characterizations of people or events that could be construed as demeaning, patronizing, or otherwise insensitive. Content-development tools track and report attributes such as gender, age, ethnicity, subject matter, and regional references. Individual attributes, as well as the intersection of multiple attributes, are tracked throughout the development process to ensure that final content is demographically balanced and free of bias. Assessment items must also pass strict quality reviews that check for discipline-specific criteria, accuracy, language appropriateness and readability level, bias and fairness, and technical quality control.
16. How does the instrument impact instruction? Provide documentation as evidence. (e.g. differentiate instruction, improve student achievement, direct student interventions, etc.)
The STAR assessments, including STAR Math and STAR Math Enterprise, meet the definition of interim assessments as developed and approved by the CCSSO, as well as by Race to the Top. The STAR tests are short-cycle, interim tests designed to provide feedback to adjust ongoing teaching and learning and improve students’ achievement of intended instructional outcomes. The STAR assessments are meant to assist teachers and administrators as they engage in the formative process that takes place daily and throughout the school year, not merely at its end.

The STAR computer-adaptive assessments take students less than 15 minutes to complete. They measure mathematics skills and produce immediate results for use by administrators, teachers, students, and parents. Educators administer the assessments based on their assessment purposes: screening and progress monitoring, as well as—for Enterprise customers—instructional planning, gauging mastery of state standards, and forecasting performance on the TCAP with enough time to make adjustments, if necessary. The interim assessments are indispensable to schools interested in assessing their students in a formative way, because they allow educators to assess students’ needs in time to make a difference through instructional planning, instructional monitoring, and intervention.
For instance, educators can use STAR Math Renaissance Place and STAR Math Enterprise to help them make screening decisions. With the Screening Report, teachers identify which students are reaching benchmark and which students may need intervention. The Screening Report also provides data to help educators decide how to allocate resources accordingly. The default reporting option generates this information at the grade level to help with grade-level planning via data team meetings. Using winter screening data, this report helps educators evaluate the effectiveness of the Tier 1 core program by keeping track of the students not receiving additional services. Using spring screening data, this report also enables data-driven planning for the following school year. Professional development for STAR is grounded in RTI principles, precisely to help educators make decisions formatively throughout the year.
Another valuable tool included with STAR Math Renaissance Place and STAR Math Enterprise is the Goal-Setting Wizard, which allows teachers to set an ambitious or moderate goal for each student, based on growth modeling trends from our large database of STAR users. Progress toward that goal is then automatically tracked in the Student Progress Monitoring Report, so educators can see the effect of interventions over time and make adjustments as needed.
For Enterprise customers, the Instructional Planning Reports help teachers and curriculum facilitators group students based on the skills they are ready to learn next, so they can make sure the students are getting regular practice and instruction at the right level and on the right skills. Another useful family of reports is the State Standards Reports, which allow district and school administrators to gauge how well groups of students are mastering the Common Core State Standards. Finally, STAR Performance Reports give educators—district administrators, school administrators, and teachers—an early warning about which students or groups of students may be at risk of not meeting Tennessee’s proficiency requirements.
17. What are the barriers to using the instrument (statutory, regulatory, etc.)?

No barriers are known at this time.
18. How has the team reached out to other educators in its representative category (grade/subject) for ideas? Please explain the result of that outreach.
IF THE DEVELOPMENT TEAM RECOMMENDS EITHER A COMPOSITE VALUE-ADDED MEASURE (claiming of specific tested students) OR A SCHOOL-WIDE VALUE-ADDED MEASURE FOR THE EDUCATOR CATEGORY, THE TEAM MUST PROVIDE THE FOLLOWING INFORMATION (instead of the questions above):
19. Please provide a brief description of the process the development team took that led to the decision to recommend using either a composite value-added measure and/or a school-wide value-added measure for evaluation purposes. Be sure to include:
a. The rationale for this choice and why other instruments are not being recommended.
b. How the team reached out to other educators in its representative category (grade/subject) for ideas, and the result of that outreach.
NOT APPLICABLE