May I share two stories about test directors?
A test director was out walking when a high altitude balloon was observed descending to earth. After hours of being buffeted by high winds, the pilot was totally lost. The pilot spotted the test director and shouted, "Where am I?" "You are exactly 158 feet above ground level," came the response. "Must be a test director," muttered the pilot. "Accurate but totally useless data."
A test director, viewed as calm under fire, was asked for the secret. The test director responded by saying, "At the beginning of each difficult confrontation, I remember a paraphrased statement once used by the late Walt Hathaway: 'Yea, though I walk through the valley of the shadow of doubt, I will fear no nay-sayers because I have the best damn data in the valley!'"
How good is our data? How good are the services which we provide to support our data? I can't think of any group in education which is more concerned about these questions than NATD members. During my ten years of NATD membership, I have continually been impressed and humbled by the dedication and intense concern of those who have a common interest in high quality educational assessment programs. As has been true of all who have preceded me as president of NATD, I want to do what I can to make NATD worthy of its members. During 1995-96, I am asking for your help as we continue to examine the "value added" by NATD to current and potential members.
An informal review of NATD membership and membership trends shows that:
I want to thank you for inviting a representative from the National Association of Test Directors to be part of this panel. Test Directors play a key role in educational assessment and are greatly affected by the code -- they are involved in all the activities of the code but one (marketing).
Test Directors can also play an important role in getting the code implemented in their school districts. In fact, a Test Director is typically the only NCME member in a school system. I couldn't come up with a good analogy for the role that we could play, so let me give you a less flattering one: we can be like computer viruses that take this code and, unbeknownst to those running our systems, copy it into system policies, procedures and training.
So, the code is very important to test directors, and test directors are very important to the dissemination of the code. So what do I, as a test director, think of the code? I was asked to address three questions, so let me take each in turn.
First, I was asked how the code could be used to inform test directors. The good news here is that the code isn't news. It reflects much of what we already do, so there isn't a pressing need to inform test directors.
However, it could be made more informative by providing more detailed examples. What I'm suggesting are not detailed operational definitions or lengthy academic treatises telling us exactly how this code applies to each situation we encounter. Instead, I would suggest compiling and recognizing examples of best practices that meet the standards of the code. As a group, I think that test directors are more than willing to liberally borrow from good examples that work in schools. Show us how the code is being successfully used in school districts and we will most likely use these examples in our systems.
Sharing best practices would also clarify the intent of the code to test directors. To give you an example, let's look at section 4.1. It states:
"inform examinees about the assessment prior to its administration, including its purposes, uses and consequences; how the assessment information will be judged or scored; how the results will be kept on file; who will have access to the results; how the results will be distributed; and examinees’ rights before, during and after the assessment."
What will this mean in practice?
Does it mean that before each ITBS test our teachers, like our police, should take out a little card and "Mirandize" our students? Will we have to read something like the following to meet the standards of the code?
The second question I was asked to address is how NCME can help test directors use the code effectively.
NCME members are vastly outnumbered in school systems -- I would guess somewhat less than 1,000 versus millions of teachers, administrators and other professional staff. And the NCME member does not give the tests, teachers do. For example, in my system I have 120 or so administrators, 200 building test coordinators, 4,000 teachers, and 70,000 students. And 1 NCME member.
As you can imagine, when there is only one of "us" and thousands of "them," one is sometimes alone when defending the testing standards we have written into district procedures. When arguing over what our standards should be, and when applying those standards when a teacher goes astray, an old joke about the Lone Ranger often comes to mind. Tonto and the Lone Ranger are trapped in a canyon with arrows filling the air. The Lone Ranger turns to his faithful companion and says, "We're surrounded, Tonto. They've got us now." To which Tonto replies, "What do you mean 'we,' Kimosabe?"
What would be very helpful would be to get the related parts into the codes and standards of good practice of other groups. This would enable us to cite not only my profession's standards but also those of principals, teachers, school psychologists and others. It would give much more weight and import to NCME's code.
Second, it would be very helpful if the code, or more appropriately a companion document, addressed how the code works with new forms of assessment. Section 1.1 states, "assure that assessment products and services are developed to meet applicable professional, technical and legal standards."
What are these standards when it comes to the new assessments? Many questions come to mind --
Lastly, I was asked what support materials would be helpful. I would suggest the following:
A "Code-Lite" with just the relevant portions for classroom teachers. We can then give that away directly or incorporate it into our products, such as directions for administering tests and codes of conduct.
Provide model interpretation or informational guides for parents and teachers. The tests we give -- NRTs, multiple-choice CRTs, writing samples, portfolios and performance assessments -- are probably more alike than different. If NCME could provide samples or templates in each category, we could then adapt them for our school systems. These could range from videos to handouts (the handouts will be most useful). I think some form of these probably already exists. And it would be reasonable to think that test publishers could develop, and distribute, such examples with their tests.
Well, that addresses the three questions and ends my 10 minutes.
1985 Proceedings: Cheating on Standardized Tests: Issues and Problems in Large City Schools; Guidelines and Principles for National Test Norming Studies
1986 Proceedings: Legitimate Ways to Prepare Students for Standardized Tests; Taming the Rasch Tiger: Using item response theory in practical educational measurement
1987 Proceedings: Perspectives on Public Reporting of Test Results; Student Achievement Comparison Among States: Issues and status; Riding the Rasch Tiger; The Testing Octopus: A tentacle for curriculum, or, How do you dance with an octopus?; Who's in Charge of Testing Now? or, A system used to consolidate testing throughout a district
1988 Proceedings: Customization of a National Standardized Achievement Test; Multi-layered Testing in the Los Angeles Unified School District
1989 Proceedings: Beyond the Wall Chart: Indicators and school ranking; Dog-eared Reports are the Best of the Breed: Recognizing and rewarding evaluation utilization; Standardized Tests at the Pre-school Level: Do they tell us the truth about children?
1990 Proceedings: Issues in Large Scale Assessment Programs
1991 Proceedings: More Authentic Assessment: Theory and practice; Measurement Issues in Performance Assessment; Local, State, National and International Indicator Systems: Will we know where we are if we get there?
1992 Proceedings: National Assessment in England and Wales; Alternative Assessments in Practice: Perspectives on Issues and Problems
1993 Proceedings: (Testing...Testing...) Do we know where we are going? Have we been here before?? The scoop from P.O.O.P.P's; Objectifying the Subjective: Rubrics, Scoring guides, and Other Ways of Knowing
1994 Proceedings: Guiding Principles for Performance Assessment