Data Does Not Tell the Whole Story
Recently, the Florida Department of Education (FDOE) released to citizens and the press a succinct 1-67 ranking of every school district in Florida. These rankings were based on a series of calculations tied directly to the administration of the Florida Comprehensive Assessment Test (FCAT) in each of Florida’s 67 counties. Some districts, like St. Johns (south of Jacksonville) and Santa Rosa (#1 and #2, respectively), have much to be proud of, as they ranked very highly. Hats off to them for the great work they are doing.
For districts like Escambia (ranked #44), the data is much less palatable.
Are we to assume that because districts like Santa Rosa and St. Johns are ranked so high, they are exponentially better at learning delivery than their lower-ranked peers like Escambia? [CAUTION, HUMOR AHEAD] I actually think that if Escambia swapped teachers with Santa Rosa, and kept everything else equal, Santa Rosa would be holding the #1 slot! [INSERT: RIM-SHOT] Okay, maybe that’s a bit simplistic and over the top, and it was of course stated in jest, but based on the press portrayals of this FDOE data I have read, one could easily conclude that the higher the district ranking, the better the school district, period.
And therein lies the problem.
The release of data like this without thoughtfully conceived disclaimers, explanations, and footnotes can and does lead to incorrect and negative public perceptions. This is because data alone does not tell the whole story. And Commissioner Gerard Robinson’s 5-second blurb about poverty being a factor during his 2-minute introduction of the data does not cut it as a disclaimer. Nobody (except me) watched that video.
Complex data should be carefully developed, and the “press release” of such data requires thoughtful deployment if accuracy is valued. Apples should be compared to apples.
A striking yet very apt analogy can be found in the community crime statistics released yearly by the FBI.
When statistics about murders, assaults, forcible rapes, and other vicious crimes are released annually, these data are put into tables and organized by events per 100,000 citizens of a particular community. For example, using 2010 data, the murder/non-negligent manslaughter rate in New Orleans (pop. 356,000) was roughly 50 times higher than it was in either El Paso, Texas (pop. 624,000) or Lincoln, Nebraska (pop. 260,000). Does this mean the police departments in El Paso and Lincoln are way better than the cops in New Orleans? Of course not. This is why the FBI takes great care to provide a carefully worded disclaimer on its website along with the yearly crime stats. Community issues, poverty, demographics, and a myriad of other social ills affect crime rates, and they affect educational outcomes as well.
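For readers who want to see the arithmetic behind those per-100,000 tables, here is a minimal sketch of the normalization, written in Python. The populations are the rounded figures quoted above; the incident counts are illustrative placeholders chosen only to reproduce roughly the 50-to-1 gap described here, not figures taken from the FBI tables.

```python
# Minimal sketch of the per-100,000 normalization used in tables like the FBI's.
# Populations are the rounded figures quoted in the post; the incident counts
# below are illustrative placeholders, NOT actual 2010 FBI figures.

def rate_per_100k(incidents: int, population: int) -> float:
    """Convert a raw incident count into a rate per 100,000 residents."""
    return incidents / population * 100_000

cities = {
    "New Orleans": {"population": 356_000, "incidents": 175},  # placeholder count
    "El Paso":     {"population": 624_000, "incidents": 5},    # placeholder count
    "Lincoln":     {"population": 260_000, "incidents": 3},    # placeholder count
}

for name, c in cities.items():
    print(f"{name}: {rate_per_100k(c['incidents'], c['population']):.1f} per 100,000")
```

The normalization matters because raw counts cannot be compared across cities of very different sizes, and even the normalized rates still ignore the social factors the FBI disclaimer warns about.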
So I propose that the next time the FDOE wants to release rankings, perhaps they should consider using the following disclaimer (taken directly from the FBI website, with “School District” substituted for “Law Enforcement” and “Educational Failure” substituted for “crime”):
“Individuals using these tabulations are cautioned against drawing conclusions by making direct comparisons between cities. Comparisons lead to simplistic and/or incomplete analyses that often create misleading perceptions adversely affecting communities and their residents. Valid assessments are possible only with careful study and analysis of the range of unique conditions affecting each local School District jurisdiction. It is important to remember that Educational Failure is a social problem and, therefore, a concern of the entire community. The efforts of a School District are limited to factors within its control. The data user is, therefore, cautioned against comparing statistical data of individual agencies.”