June 2, 2014

A Few of PARCC’s Lessons from “Testing the Test”

By Callie Riley

From late March into early June, millions of students across the country have experienced something different and exciting. Although the two state-led Common Core testing consortia—the Partnership for Assessment of Readiness for College and Careers (PARCC) and Smarter Balanced—have been working since 2010 to build these next-generation assessment systems, the tests formally debuted to the field in spring 2014. Perhaps the most exciting part of this debut has been testing the test: educators and students have been providing a formal feedback loop to state leaders, who are working to analyze and refine the assessments before they become fully operational in the 2014–15 school year. That said, feedback is only as good as how it's used—and PARCC states have been taking these lessons seriously.

So what have been some of the major lessons learned so far?

1. Know the items and their advances. As I mentioned in a previous post about the release of PARCC sample items in fall 2013, educators and students have benefited from becoming familiar with the items before the field test. As with any computer-based test (although both consortia are field testing traditional pencil-and-paper versions as well), the elements that make the tests more engaging can also be a little intimidating. Drag-and-drop, highlighting, using an equation editor—these are just a few of the tools covered in PARCC consortium tutorials that allow both educators and students to learn the technology in advance. Although no student- or school-level data have been released from the low-stakes field tests, the tests have given PARCC an opportunity to begin introducing the technology and assessment advances early in the game. Additionally, educators and students can continue to use these tools to spark a conversation about where they'd like to see the assessments go in the future.

2. Remember why we field test assessments. Although the two consortia continually reminded participants that each field test was really a test of the test and not of the students, that message was often lost in conversations about the continued administration of current state assessments, politics, and other hot topics. Educators from across the PARCC states—from Massachusetts to New Mexico—spent thousands of hours reviewing, revising, and debating the more than 10,000 items that made it onto the field test. Their dialogues and debates centered on bias and sensitivity, alignment to the standards, and accessibility. With this incredible amount of expertise behind the development of the PARCC assessments, we must remember that a field test is about checking our assumption that we "got it right"—ensuring that the 2014–15 roll-out will be as smooth as possible and that educators, students, and their families will find the information generated by the assessments truly useful. The field test has also provided an opportunity to gather narrative feedback from students and test administrators on what they liked, what they didn't like, and what needed to change.

3. Follow the dialogue to ensure problems are fixed quickly. Perhaps the most fascinating part of the field test was the ability to follow educator and student conversations across the PARCC states as they took the performance-based assessments from late March through early April and during the recent end-of-year assessment window from early May through early June. Educators have been incredibly transparent with and responsive to one another, especially on social media—sharing new information, quick fixes, and insights as the field test progressed. With this information, leaders in the PARCC states were able to pinpoint where and how often issues were occurring in districts and schools, which allowed the states to keep improving the experience of both student field test takers and those administering the tests. As the #PARCCfieldtest hashtag emerged on Twitter, my colleagues and I were able to respond to requests swiftly and to learn how implementation was truly taking shape on the ground. Above all, the notes of encouragement to and from educators across the PARCC states were inspiring, capturing the true spirit of collaboration.

These are only a few of the lessons PARCC states learned during the field test; there will be many more to come. As the full spring field test comes to a close, we'll continue to monitor progress, gather feedback, and incorporate it into the improvements we make for next year's operational test. To ensure that those improvements happen, we will rely on those of you in the field to continue the open and honest dialogue as we move into full implementation during the 2014–15 school year. Join the conversation on Twitter by searching for #askPARCC or by following @PARCCPlace. To receive regular updates, you can sign up for the PARCC Updates newsletter to find out what's happening—and what's to come—across the states in terms of exciting policy and practice stories.

Even if you weren’t a part of the PARCC field test experience, you can see what makes the test different by taking the PARCC practice tests online at http://www.parcconline.org/practice-tests.