4. NWEA Update
• Updated SC Linking Study
• Common Core
• New Items Field Testing
• Teacher Evaluation
• Virtual Comparison
• Growth Norms
• College Readiness
5. MAP Aligned with Common Core: Ongoing Process
The Early Years
• (2010) A Review and Validation of the Common Core Standards, presented to CCSSO, Achieve, and the National Governors' Association
• (2011) NWEA releases the first Common Core version of MAP: hand-aligned, resulting in tight alignment in each content area
The Bridge Years
• (2012) State-aligned versions built for states that adopted CCSS +15%; technology-enhanced items under development
• (2013) New test versions being built to reflect grade-level standards; stand-alone field testing commences
6. MAP Aligned with Common Core: Ongoing Process
MAP Aligned with CC, Version Three (Fall 2012) and Version Four (Fall 2013):
• Better content coverage
• Deeper assessment of a student's depth of knowledge (DOK)
• Increased item/test validity
• More engaging test experience for students
Blended Assessment (release TBD):
• Measures both proficiency and growth
• A new MAP assessment solution based on the CCSS that will provide kids, educators, and parents with both on-grade and off-grade performance information
7. South Carolina CCSS Timeline
School Year | Implementation Phase
2011-2012   | Transition Year
2012-2013   | Transition Year
2013-2014   | Bridge Year
2014-2015   | Full Implementation
8. Impact on Data and Reports
Functional Area | Impact
Growth Measurement: No impact; comparisons are made at the measurement scale level and are not affected by changes in state standards.
Projected Proficiency: The updated SC linking study will continue to be used. A new linking study will be conducted once a sufficient number of students have completed the CCSS (+15%) aligned state assessments.
Goal Level Reporting: No impact; data are automatically aggregated by goal structure.
Norms: Content-independent (used across multiple states' standards); these norms can be used for Common Core assessment.
DesCartes/PGID: No impact; DesCartes/PGID will be available for the currently licensed assessments.
9. NWEA Update
11. Technology-Enhanced Items with Interactive Elements
http://www.nwea.org/common-core-new-item-types-map
• Better content coverage
• Deeper assessment of a student's depth of knowledge (DOK)
• Increased item/test validity
• More engaging test experience
• Interactive elements supported in July 2012
– Drag & drop
– Click and pop
– Hot spot
12. Common Stimulus Items
• Share a common item asset with other items
• Chosen adaptively; if selected, a number of items associated with the common stimulus item are presented consecutively
• Common stimulus items available in July 2012
– Use passages as the common asset
– Allow deeper assessment of reading comprehension
34. FAQs about NWEA's CCSS
• When should our districts consider switching over to CCSS MAP tests?
– You may change when it's the best fit/time for YOUR DISTRICT.
• Can we give SOME students the CCSS version and other students the state version that we currently give?
– Yes. You may test any grade level you wish on Common Core and any other grade levels on your state version.
• When must we inform NWEA of our choice to move to CCSS or to stay with our state version?
– We would like to know your decision about one to two weeks prior to your district's downloading of the next season of data.
35. FAQs about NWEA's CCSS
• How will proficiency to CCSS be forecasted?
– Until students are scored under the new CCSS, we will use the 40/70 cuts.
• How will reports change if we switch to Common Core?
– Teacher reports and all other reports that list goal strands will now list the goal strands as defined by the Common Core instead of the strands aligned to your state goal structures.
36. FAQs about NWEA's CCSS
• Do the 2011 norms apply to the Common Core aligned tests?
– Yes; the 2011 norms are carefully constructed to be independent of any specific test.
• How can we make the change to Common Core MAP tests?
– The process is simple. Just call or email Laura Riley or Sue Madagan at NWEA.
37. Resources
Recorded webinars (expire May 13, 2013):
• A Guide to MAP and the Common Core for Teachers: http://nwea.adobeconnect.com/cc1_resources/
• A Guide to MAP and the Common Core for Leaders: http://nwea.adobeconnect.com/cc2_resources/
• Handout: Guide to Common Core and Measures of Academic Progress (MAP): What Leaders Need to Know
38. Bridging the Gaps: What gaps exist in your district between what you need and what's provided?
39. NWEA is Distinctive: Things to consider in SC…
• No K-2: New Literacy Movement in state
• Longitudinal Growth Measures: historical data
• Growth Projections
• Predictive Capabilities: Linking Studies (ACT and future Common Core)
• Blended Assessment: growth and proficiency
• Links to Instructional Providers: Compass, E2020, Study Island
• Transition Year
40. NWEA is Distinctive: Things to consider in SC…
• Interim Assessment: optional
• Adaptability: accurate for high- and low-performing students
– Summative tests give only grade-level results
• Norms: national norms based on 30 million kids (so far)
• Valid and reliable status and growth data necessary for Teacher Evaluation: supports Teacher Effectiveness Models
• GRD (Growth Research Database): the most extensive collection of student growth data in the country, with 4.5 billion pairs of test items and responses
42. NWEA Update
43. Three primary conditions for using tests for teacher evaluation
1. Selection of an appropriate test:
• Used for the purpose for which it was designed (proficiency vs. growth)
• Can accurately measure the test performance of all students
2. Alignment between the content assessed and the content to be taught
3. Adjustment for context/control for factors outside a teacher's direct control (value-added)
45. Leadership Courage Is A Key
Ratings can be driven by the assessment. Real or noise?
[Chart: observation vs. assessment ratings (scale 0-5) for Teacher 1, Teacher 2, and Teacher 3]
46. NWEA Update
47. Problems we can help you solve
• "I know my students didn't make typical growth . . . my school and my students aren't typical."
– Need an apples-to-apples comparison to demonstrate what is possible: a proof point
48. What if we could compare to similar schools and students?
• We identify your schools and students
• Identify all matching students from the GRD
– School income, urban vs. rural classification
– Grade, subject, starting achievement, assessment dates
• Randomly select the comparison group
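The matching-and-sampling process described above can be sketched in a few lines. This is an illustrative sketch only: the record fields (`income_band`, `locale`, `start_rit`, `test_week`) and the group size are assumptions for demonstration, not NWEA's actual GRD schema or VCG procedure.

```python
import random

def build_vcg(target, grd_records, group_size=51, seed=0):
    """Select a random comparison group of GRD students matching the
    target student on the school- and student-level criteria above."""
    matches = [
        r for r in grd_records
        if r["income_band"] == target["income_band"]        # school income band
        and r["locale"] == target["locale"]                 # urban vs. rural
        and r["grade"] == target["grade"]
        and r["subject"] == target["subject"]
        and r["start_rit"] == target["start_rit"]           # starting achievement
        and abs(r["test_week"] - target["test_week"]) <= 1  # similar test dates
    ]
    rng = random.Random(seed)  # seeded for reproducible selection
    return rng.sample(matches, min(group_size, len(matches)))
```

The target student's own observed growth would then be compared against the growth distribution of the sampled group.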
49. Virtual Comparison Groups (VCGs)
• Compare your students' growth to similar students in similar schools
• Compare to state cut scores
• Compare to a catch-up growth target
51. Ensure similar testing conditions, since they matter in the reliability of results
[Chart: mean value-added growth by school, comparing students taking 10+ minutes longer in spring than in fall against all other students]
52. Virtual Comparison Groups
• Graphs at the classroom, school, and district level
– Two hours of remote support
– Tool with student and testing condition data
• Optional
– Data Coaching
– Research Consultation
53. Problems we can help you solve
• "I think our students are learning more now than before, but I don't have the time or expertise to prove it."
– Need a rigorous and defensible analysis of your longitudinal growth and achievement data
54. Learning Pattern Reporting
• Supports your work
– District and School Improvement
– Communication with Board, Administrators, Parents and Community
• We provide you a defensible analysis of your growth and achievement data
– Grades 2 through 8
– Contiguous grades
– Four or more years
– Fall and Spring
55. What do we do?
• We use all your Fall and Spring MAP data
– All students are linked and the data are cleaned
– The data are broken into overlapping four-year blocks, each lagged by one year
[Diagram: grades 2-5 across Fall (F) and Spring (S) terms of school years 2007-2012, grouped into three overlapping four-year data blocks]
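The overlapping, one-year-lagged blocks can be sketched as a small helper. This is an illustrative sketch of the blocking scheme only, not NWEA's actual analysis code; the function name and signature are assumptions.

```python
def four_year_blocks(first_year, last_year):
    """Return overlapping four-year (start, end) school-year blocks,
    each block lagged by one year from the previous one."""
    span = 3  # a four-year block covers start .. start + 3
    return [(y, y + span) for y in range(first_year, last_year - span + 1)]

# The deck's example spans 2007-2012, yielding three data blocks.
print(four_year_blocks(2007, 2012))
# -> [(2007, 2010), (2008, 2011), (2009, 2012)]
```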
56. What do we do? (cont.)
• A statistical growth model is then fitted, including both achievement and growth
• The model represents the "real" school
[Diagram: RIT by grade (2-5) across Fall and Spring terms of school years 2009-2012, showing age-cohort and grade-level trajectories]
58. Findings are reported using defensible indicators about your patterns
[Charts: pattern indicators across the 2010, 2011, and 2012 school years]
59. Learning Pattern Reporting
• Report on patterns found in your data
– Four hours of remote support
– Tool you can use
• Optional
– Data Coaching
– Research Consultation
60. NWEA Update
62. 4 Principles for ESEA Waiver Approval
• Demonstrate CCR expectations for all students
• Develop high-quality plans to implement a system of differentiated recognition, accountability, and support for all Title I districts and schools
63. 4 Principles for ESEA Waiver Approval (cont.)
• Commit to developing, adopting, piloting, and implementing teacher and principal evaluation systems that support student achievement
• Provide assurance that you will evaluate and, if needed, revise administrative requirements to avoid duplication and unnecessary burden to the district
64. ESEA Waiver Timeframe
• Arne Duncan has currently approved the waivers through the 2013-14 school year. At that time, districts may request extensions.
65. SC ESEA Waiver Study
• Annual Measurable Objectives (AMOs) are now based on school-level mean scale scores rather than the percent of students meeting their proficiency targets (see graphic on next screen).
66. [Graphic: AMO targets based on school-level mean scale scores]
67. AMO Increases
• As the AMO increases over the next several years, the mean RIT scores needed to meet the escalating demand will increase as well.
• NWEA correlation tables include data through the 2017-2018 school year.
69. Example
Table 2 (below) contains the 2012-2013 elementary-level AMO targets and the associated NWEA RIT targets by grade for a school.
While this study is not able to directly estimate the probability of an elementary school meeting its AMO target, the school could be considered on track for success if all students in grades three, four, and five have NWEA RIT scores greater than 206, 212, and 224, respectively.
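The on-track rule quoted above amounts to a simple per-grade threshold check. The RIT targets below are the ones named in the example; the data structure and function are illustrative assumptions, not an NWEA tool.

```python
# Grade -> minimum spring RIT from the 2012-2013 elementary example above.
RIT_TARGETS = {3: 206, 4: 212, 5: 224}

def on_track(scores_by_grade):
    """True if every student's RIT score exceeds the target for
    that student's grade. scores_by_grade: {grade: [RIT scores]}."""
    return all(
        score > RIT_TARGETS[grade]
        for grade, scores in scores_by_grade.items()
        for score in scores
    )

print(on_track({3: [210, 215], 4: [220], 5: [230]}))  # True
print(on_track({3: [200], 4: [220], 5: [230]}))       # False: 200 is not above 206
```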
78. NWEA Research
The NWEA Research Team conducted an alignment study of students who have both valid MAP scores and valid EXPLORE, PLAN, and ACT scores. The results showed a correlation between ACT entrance scores, MAP RIT scores, and the year-to-year growth path needed to achieve the desired entrance score.
79. Where did the numbers come from?
• Active NWEA districts that use EXPLORE, PLAN, and ACT
• ACT data were matched to corresponding MAP data at the individual level
• No formal sampling strategies were employed other than to cut extreme residuals
84. ACT Says:
• The ACT Composite "entrance" scores used are scores of students whom the ACT data indicate have a 50% likelihood of achieving a "B" average in a freshman-level course.
• The demands of the courses differ across post-secondary institutions.
89. Student "Paths"
Introducing three normal students and their (potential) postsecondary paths:
• Theodore Thirdgrader
• Sandra Seventhgrader
• Nate Ninthgrader
91. Theodore Thirdgrader's Path
NWEA data indicate that for Theodore to achieve the entrance Composite ACT score for these institutions, his spring RIT score should approach:
Entrance ACT | Spring RIT
24           | 213 (78th %ile)
29           | 224 (99th %ile)
32           | 229 (99th %ile)
92. Another Way to Look at Third Grade Spring RIT Scores
• The average ACT Composite entrance score for an Education major is 20.8; third grade spring RIT for a student on a 20.8 trajectory is 209.
• The average ACT Composite entrance score for an Engineering major is 23.7; third grade spring RIT for a student on a 23.7 trajectory is 219.
93. Nate Ninthgrader's Path
NWEA data indicate that for Nate to achieve the entrance Composite ACT score for these institutions, his spring RIT score should approach:
Entrance ACT | Spring RIT
24           | 237 (53rd %ile)
29           | 246 (71st %ile)
32           | 251 (79th %ile)
94. Another Way to Look at Ninth Grade Spring RIT Scores
• The average ACT Composite entrance score for an Engineering major is 23.7; ninth grade spring RIT for a student on a 23.7 trajectory is 254.
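The ACT-to-RIT pairings from the two path slides can be collected into a simple lookup. Only the (grade, entrance ACT) to spring-RIT values quoted in this deck are included; the helper itself and its name are illustrative, not part of NWEA's linking study.

```python
# (grade, entrance ACT Composite) -> spring RIT target, from the slides above.
SPRING_RIT_TARGET = {
    (3, 24): 213, (3, 29): 224, (3, 32): 229,  # Theodore Thirdgrader
    (9, 24): 237, (9, 29): 246, (9, 32): 251,  # Nate Ninthgrader
}

def rit_target(grade, entrance_act):
    """Spring RIT a student should approach to stay on trajectory
    for the given ACT Composite entrance score."""
    return SPRING_RIT_TARGET[(grade, entrance_act)]

print(rit_target(3, 29))  # 224
print(rit_target(9, 24))  # 237
```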
101. Thank you for your continued partnership with NWEA.
Sincerely,
Your NWEA Team
Laura Riley
Sue Madagan
Alison Levitt
Andy Hegedus
Editor's Notes
Multiple-choice items were hand aligned by the NWEA Content Services Team. This has resulted in a tight alignment to the standards at a granular level in each content area. Items are directly aligned to grade-level standard(s), which assures each item measures appropriate content and is appropriate for the item pool. This also allows us to do a gap analysis, finding standards with few alignments (MAP is multiple choice only), guides new item development and pool enhancement, and allows us to demonstrate alignment. Items in the pool have no grade association: it is a MAP pool.
The purpose of this slide is to show our ongoing progression of CCSS aligned MAP – and the introduction of our new Blended Assessment solution.
Assessment alignment to curriculum/instruction is fundamental; performance may be affected if there is misalignment. Will we see a drop in RIT scores? (Short video)
Below is a list of questions that you can choose from when appropriate for your partner:
• How will teachers and students understand where and how learning is happening?
• How will the Consortia support teachers? Are the online resources the only planned offering?
• When will the Consortia have outside evidence that their systems are accurate?
• What is the plan for scoring performance assessments? What research will support this scoring?
• Are scores from SBAC's short and long test versions intended to inform the same decisions?
• How precise will the underlying scores be for supporting/informing decisions?
Possible to use the Intel Visual Ranking tool to rank order these? http://www.intel.com/content/www/us/en/education/k12/thinking-tools/visual-ranking/overview.html (Short video.) Additionally: PD to support use of data, plus strategies and techniques for differentiating and embedding formative assessment minute-to-minute and day-by-day. Standards are very skill based; curriculum is what makes our work unique: individualized to our students and our students' needs, and supportive of personalized learning environments. More focus, more coherence, and more rigor: building habits of mind.
These features are very relevant when competing in the Consortia world; SBAC and PARCC are missing almost all of these distinctions.
CUSTOMIZE THIS SLIDE: either PARCC or SBAC. Below is a list of suggested questions that you can choose to use as appropriate for your partners:
• When will Consortia states have a valid method to predict performance on summative assessments?
• When will the Consortia states have cut scores?
• What are the plans for scale development?
• How will states get longitudinal growth data?
• How will teachers determine growth targets for every child?
• How will teachers and students track improvement and learning during the year?
• Where are SBAC and PARCC's research plans?
• How soon will norms be available? Are there plans to develop growth and status norms?
The green line is each teacher's value-added estimate and the bar is the error of measure. Both at the top and the bottom, teachers can be in other quartiles; teachers in the middle can cross quintiles, just based on SEM. Cross country analogy: winners spread out at the end of the race while the middle forms a pack, so moving up from the middle makes a big difference in the overall race. The instability and narrowness of ranges mean that when evaluating teachers in the middle of the distribution, slight changes in performance can produce a large change in performance ranking.
We heard from many of you that your situation is not exactly typical and you wanted a way to compare the growth of your students to something that was matched to your situation: an apples-to-apples comparison. That's why we created Virtual Comparison Groups, or VCGs. {Click} This is a sample report at the classroom level that shows individual students and how they did compared to similar students in similar schools, and compared to the state proficiency standards. The black is the actual growth of each student. The grey is the growth of a randomly selected group of students matched on both student and school criteria like grade, subject, starting achievement, school free and reduced lunch percentage, and school location (is it an urban school or a rural school?). The green and blue lines provide a reference point to how your state classifies your students' achievement. VCG reports like these, and ones showing higher levels too, come along with a simple electronic tool for further exploring the data.
Whether showing individual students in a classroom or grades in a district, think about how VCGs can change the conversations. When comparisons are apples to apples, people now know what is possible. Conversations can focus on individual students or on grade levels, as part of one-on-one conversations or as part of a school improvement process. I can explain more about VCGs if you want to see me later, or look for an email that's coming in early April with more information.