The Nation's Report Card — Results from the 2005 NAEP assessments in reading and mathematics
Dr. Peggy G. Carr : Hello, and welcome to today's StatChat on the NAEP 2005 reading and mathematics reports. I hope you've had time to examine the results. There are many findings in today's reports, and I'm interested to hear which ones you want to talk about. So, let's get right to your questions.

Richard from Madison, Wisconsin asked:
While the math NAEP results have gone up, the TIMSS math results have not gone up in fourth grade and there has not been much improvement for eighth grade. What has been done to reconcile these two findings?
Dr. Peggy G. Carr : You are correct that TIMSS did not show an increase in mathematics at grade 4 between 1995 and 2003, and only a small increase at grade 8. While NAEP also shows a small increase at grade 8, there has been a stronger improvement at grade 4. One point to keep in mind is that these assessments were given at different times--we are comparing NAEP 2005 results with TIMSS 2003 results. Also, the two assessments use different frameworks and different formats. In 1999, researchers worked to link the NAEP results to the TIMSS results statistically. You may find that report under NCES publications at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=200501.

Leonie from NY,NY asked:
Do we have the results for large urban districts like NYC yet? If not, when will those results be available?
Dr. Peggy G. Carr : Results from ten large urban districts that participated in NAEP's trial district-level assessment program will be released in late November or early December. Separate mathematics and reading reports will document 2005 performance as well as trends that go back to 2002 for reading and 2003 for mathematics.

Judy from Lansing, Michigan asked:
What makes a significantly different score? For example, I want to use the info that my state's eighth grade math scores are lower than 24 jurisdictions, but I'd prefer to say it's lower than 23 states and Department of Defense schools, except I can't be sure of that info from the charts. Thank you.
Dr. Peggy G. Carr : Whether two scores are significantly different is determined by conducting statistical tests. In the case of state and jurisdiction comparisons, many of these tests are conducted simultaneously. As for your specific question: go to http://nationsreportcard.gov and click on "State Comparisons" in the left-hand navigation menu. At the top, select the "Mathematics" tab, and on the second row of tabs select "Average Scale Score Map." You will see a map and can click on Michigan to make it the focal state. You will then see that the Department of Defense schools have higher scores than Michigan.
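To make the idea concrete, here is a minimal Python sketch of the kind of test involved: a two-sided z-test on the difference between two jurisdictions' average scale scores. The means and standard errors are invented for illustration (they are not actual NAEP estimates), and any adjustment for the many comparisons conducted simultaneously is omitted.

```python
# Minimal sketch (not NAEP's production procedure): a two-sided z-test on the
# difference between two jurisdictions' average scale scores. The means and
# standard errors below are invented for illustration only.
from math import sqrt
from scipy.stats import norm

def significantly_different(mean_a, se_a, mean_b, se_b, alpha=0.05):
    """Return whether two independent averages differ significantly."""
    z = (mean_a - mean_b) / sqrt(se_a ** 2 + se_b ** 2)
    p = 2 * norm.sf(abs(z))          # two-sided p-value
    return p < alpha, z, p

# Hypothetical focal state vs. comparison jurisdiction
significant, z, p = significantly_different(277.0, 1.1, 280.5, 0.9)
print(f"z = {z:.2f}, p = {p:.4f}, significantly different: {significant}")
```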

Eduardo Alvarez from Washington DC asked:
When will the public see the taxpayer-funded HLM analysis of NAEP data that looks at public school student achievement versus private school student achievement?
Dr. Peggy G. Carr : The HLM report is in the final stages of NCES' extensive review process. Based on that schedule, the report should come out early next year.

Richard from Knoxville, Tennessee asked:
The NAEP results are often reported in the media by ages (9, 13, and 17 years old), but are these ages accurate? Are 4th and 8th graders, for instance, older today than they were in the 1970s or 1980s? Has the recent increase in retention in grade muddied the NAEP results? In other words, are we now testing 11-year-old 4th graders rather than 9-year-olds? Have the 2005 4th or 8th graders completed more years of schooling than previous cohorts because of the increases in retention in grade? For instance, Florida now mandates retention of any 3rd grader not achieving the cutoff on the FCAT reading assessment. Some 30,000+ students annually are retained, and almost half that number spend a third year in third grade. Florida had a significant rise in NAEP scores on the last assessment, but by then 30,000+ low-achieving students had been removed from the NAEP 4th-grade testing pool. Has the NAGB attempted to account for this policy and population shift?
Dr. Peggy G. Carr : Richard, thank you for your question. The NAEP samples show the opposite of the trend you hypothesized: the percentage of students above the modal age has remained relatively stable over the years and is actually lower in 2005 than it was in the early 1990s. Explore the new NAEP Data Explorer (NDE) for more information on this topic.

Elizabeth from Arlington, Virginia asked:
Can we really tie No Child Left Behind to these scores? Flat reading scores and upward trends in math are nothing new. How can we chalk these scores up to NCLB, positively or negatively?
Dr. Peggy G. Carr : Elizabeth, NAEP data are of particular use to policymakers because they provide reliable information about students' achievement, indicating whether or not we are meeting our educational goals. As a large-scale assessment survey, however, NAEP is not designed to answer causal questions or to explain why results look the way they do.

Yami from Milwaukee, Wisconsin asked:
What would the results for Hispanic children tell you? Should there be more of an emphasis on bilingual education to catch up children who learned Spanish first?
Dr. Peggy G. Carr : NAEP is not designed to answer your question about bilingual education. However, I can say that Hispanic students have made some of the largest gains of any student group. In addition, the White-Hispanic gap has continued to narrow in our recent assessments. The NAEP assessment has also become more inclusive of both students with disabilities (SD) and English language learners (ELL), so more information is available for researchers interested in investigating such issues.

Jack Turner from Chicago, Illinois asked:
There was talk about the grade 12 NAEP being required, or given, so that state results would be available. I haven't heard anything lately; is that still on track?
Dr. Peggy G. Carr : The National Assessment Governing Board (NAGB), which determines what assessments will be administered, has discussed this issue at length. It was among several recommendations put forth by a special commission convened by NAGB to examine issues related to the 12th-grade assessment. At this point, NAGB is considering whether or not to recommend state-by-state reporting at grade 12 on a trial basis. It is likely to be discussed further at the upcoming November NAGB meeting. In the end, however, the decision rests with Congress if it is to materialize.

Gloria from Bridgewater, Massachusetts asked:
The scores for mathematics continue to make gains. How much of this gain do you attribute to the efforts of school districts to implement standards-based learning?
Dr. Peggy G. Carr : Gloria, we simply can't be sure how much of the math gains can be attributed to such efforts; the assessment design doesn't allow for that kind of causal attribution.

Aileen from San Francisco, California asked:
I'm told that there are NAEP results for Los Angeles and San Diego. Why are there no results for San Francisco, or do you plan to test all large cities?
Dr. Peggy G. Carr : NAEP initiated a trial district-level assessment in 2002. A range of urban districts was recruited to participate so that we could test the viability of this new NAEP component in a variety of settings. Participating districts have responded enthusiastically, and expansion will be considered if additional funds become available.

Don from Palo Alto, California asked:
The significant grade 4 reading gains from 2003 to 2005 published by NAEP for Arkansas, Louisiana, Pennsylvania, and Massachusetts are suspect, because NAEP exclusions of students with disabilities and English language learners increased in these states. Does NAEP plan to publish corrected achievement trends soon?
Dr. Peggy G. Carr : NAEP published, this morning on the web, the investigation of the potential impact of changes in exclusion rates that you mention. See http://nces.ed.gov/nationsreportcard/about/2005_effect_exclusion.asp. The role of similar studies in the official NAEP results is under consideration.

Neil from Dickinson, North Dakota asked:
Please give the meaning of the numbers. Are they percentages, etc.?
Dr. Peggy G. Carr : NAEP reports two types of numbers: scale scores and percentages. Scale scores are typically on a 0-500 scale, with 500 being the highest. The National Assessment Governing Board (NAGB) sets cut scores on this scale to indicate Basic, Proficient, and Advanced performance. Percentages of students are then reported for below Basic, at or above Basic, at or above Proficient, and at Advanced. Hence, both scale scores and percentages can be found in the report (see http://nationsreportcard.gov). On that website, you can also find item maps. These maps list items from the assessment and indicate the scale score at which a majority of students answer each item correctly. In other words, they give examples of the kinds of tasks a student at a given score level can typically perform successfully.
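As a rough illustration of how cut scores turn scale scores into the reported percentages, here is a short Python sketch. The cut scores and the individual scores are placeholders, not official NAGB values, and real NAEP estimates also involve sampling weights and plausible values, which this sketch ignores.

```python
# Sketch of how reported percentages relate to cut scores on the 0-500 scale.
# Cut scores and the score list are hypothetical placeholders.
scores = [205, 218, 231, 247, 255, 263, 288, 190, 224, 241]  # made-up scale scores
BASIC, PROFICIENT, ADVANCED = 214, 249, 282                  # placeholder cut scores

n = len(scores)
pct = lambda cond: 100 * sum(cond(s) for s in scores) / n
print(f"Below Basic:            {pct(lambda s: s < BASIC):.0f}%")
print(f"At or above Basic:      {pct(lambda s: s >= BASIC):.0f}%")
print(f"At or above Proficient: {pct(lambda s: s >= PROFICIENT):.0f}%")
print(f"At Advanced:            {pct(lambda s: s >= ADVANCED):.0f}%")
```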

Mario from Ely, Minnesota asked:
Given that 1) great new efforts have been expended in recent years to help improve student achievement in math and reading, and 2) there's little to show in terms of significant improvement, is it possible that the problem is not with the curriculum content but rather with teaching itself? Given that the pool of teaching talent is historically drawn from colleges of education that (1986 Ken Keller Report, U of MN) score 2nd from the bottom (in terms of GPA scores) of all colleges at a given University, then I should like to pose the following question: would smarter teachers help to produce smarter students i.e. improve student achievement scores?
Dr. Peggy G. Carr : Mario, you have asked an interesting and important question. Much research has been conducted on the topic of teacher preparation, but no definitive answer has been found. If you are personally interested in pursuing this research question, the new web-based NAEP Data Explorer (NDE) can be used to examine how teacher background variables are related to student achievement. However, these analyses will not support causal inferences, i.e., that better-prepared teachers cause students to learn more.

Scott from Houston, Texas asked:
It appears that Asian/Pacific Islander students in the 4th and 8th grades outperformed all other ethnic groups in math and share the top performance rating with White students in reading. Is that correct, and if so, do you have any suggestions about why this might be?
Dr. Peggy G. Carr : You are certainly right that Asian/Pacific Islander scores are among the highest of the racial/ethnic groups. You might also want to run statistical tests in the NAEP Data Explorer (NDE) to verify to what extent Asian/Pacific Islander scores are significantly higher than those of other racial/ethnic groups. However, the reasons for these differences are best left to policymakers and other experts.

Don from Palo Alto, California asked:
State NAEP achievement trends are grossly distorted by fluctuating rates of excluding students with disabilities and English language learners. A third of all students with disabilities are not represented in NAEP grade 4 reading achievement results. When will NAEP publish trends that correct for the exclusion bias?
Dr. Peggy G. Carr : Don, NAEP reports its results for students who are capable of being tested and can be found in school. Students who can't be found in schools, such as those schooled at home, chronically ill students, and students in hospitals or correctional facilities, as well as students who can't be meaningfully tested, are not part of the population to which the sample results can be generalized. To the extent that inferences are drawn to a population expanded to include untestable students, the "distortion" you refer to can arise. However, I would not characterize the distortions as "gross." For the 52 states and other jurisdictions, the correlations between changes in exclusion rates and average score gains were low to moderate (.48 and .19 for grade 4 and grade 8 mathematics, and .63 and .40 for grade 4 and grade 8 reading). Changes in inclusion explain only a small part of the score gains.
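For readers who want to see what this kind of check looks like, here is a small Python sketch of the correlation being described: across jurisdictions, how strongly changes in exclusion rates track average score gains. The data below are invented for illustration, not actual state results.

```python
# Sketch of the correlation referenced above, computed across jurisdictions.
# All values are invented for illustration; they are not NAEP state results.
import numpy as np

exclusion_change = np.array([0.5, -1.2, 2.1, 0.0, 1.4, -0.3, 0.8])  # percentage points
score_gain       = np.array([1.0,  0.2, 3.5, 0.8, 2.0,  0.5, 1.1])  # scale-score points

r = np.corrcoef(exclusion_change, score_gain)[0, 1]
print(f"Pearson r = {r:.2f}")
# A moderate r means exclusion changes account for only part of the variation
# in gains; r squared gives the share of variance explained.
```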

Kyle from New York, New York asked:
Will you be releasing the number of students tested in each state?
Dr. Peggy G. Carr : Kyle, the number of students tested in each state (rounded to the nearest 100) is located at http://nationsreportcard.gov/reading_reading_math_2005/s0095.asp. Typically, NAEP assesses a representative sample of 2500-3000 students per state.

Pam from Vineland, New Jersey asked:
Are the numbers of students tested in New Jersey in each of the grade levels and subjects available right now? Where can I find them? If not, when will that information be made available?
Dr. Peggy G. Carr : Pam, please refer to the answer to Kyle's question. Thank you.

Scott Welch from Philadelphia, Pennsylvania asked:
Dr. Carr, from the NCES data I have seen, the reading scores for the past four years were:
4th grade: 2005: 219, 2004: 219, 2003: 218, 2002: 219
8th grade: 2005: 262, 2004: 259, 2003: 263, 2002: 264
With all the additional money the Federal Government is granting to states for SES programs, it would seem that more improvement would be visible. Instead, there are what could be termed flat graphs. However, as there are school districts (not just in PA) that seem reluctant (to be polite) to give SES providers that opportunity, where does this leave those of us who really want to help children improve their reading skills well above Basic, into Proficient? Is there perhaps a Federal monitoring department that actually ensures that school districts do what they are supposed to, as it is Federal money supposedly going to these programs?
Dr. Peggy G. Carr : Scott, the NAEP mathematics scores have shown continued improvement since the first assessment in 1990, so I take it you are referring to the reading results. It is true that the overall reading scores are relatively flat, but there have been gains among some groups, such as Black and Asian/Pacific Islander students at grade 4. As to your question on monitoring projects that use Federal funds, there are agencies within the Department of Education that perform this task. For more information, check the ED website at www.ed.gov.

Marilyn from Tampa, Florida asked:
What percentage of test takers represent private schools and are private schools from every state represented?
Dr. Peggy G. Carr : The representation of private school students is commensurate with their prevalence in the population. Estimates from the NAEP sample are descriptive of the nation as a whole.

Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact NAEP staff or me for more information. I hope that you found this session helpful and the reports interesting. Please visit the website in the coming weeks for more information on the release of results from the trial urban district assessment (TUDA) in reading and mathematics.




National Center for Education Statistics - http://nces.ed.gov
U.S. Department of Education