Sunday, April 19, 2015

Looking at Disproportionality

The concept of disproportionality underlies much of the education reform movement in the United States. Sometimes referred to as the achievement gap or opportunity gap, the basic idea is that outcomes are not equitable for all students. While much of the conversation focuses on race, disproportionality applies to any subgroup: gender, special education, English language learners, free/reduced lunch, and so on. Over the past several years, much of the conversation about disproportionality has focused on student achievement---and test scores in particular. But just as we need to look beyond race, there is more to children than their test scores. We can look at disproportionality as it relates to sports and after-school activities, student discipline, attendance, and other factors.


If you want to examine disproportionality within your system, there are a few pieces of data that you will need to know. In the example below, I'm going to use gender (male, female) as subgroups. (Note: I realize this is a very heteronormative view of gender. Our data systems need to catch up with our increasing understanding as a society about gender identification; however, right now, most school data systems are set up to only capture the binary male/female...so I'm going to use it as an example.)

First, you need to know the enrollment numbers and percentages for each subgroup. In other words, how many students could possibly participate in an athletic program, be subject to suspension/expulsion consequences, fail Algebra, or, yes, pass the state test? Many schools report gender as close to a 50/50 split, as one might expect, but variations do exist. Don't assume that you're starting off with equal pools of participants.

Second, you need to know the participation numbers and percentages for each subgroup. Just because everyone is eligible to pass the state test doesn't mean that they do. So, how many males/females met the standards? How many in each subgroup were suspended? Enrolled in Physics or Calculus? Turned out for basketball?


In this example, we have a school with 250 males (47.6% of the 525 students) and 275 females (52.4%), with 50 from each group enrolled in Algebra. Now we need to calculate the disproportionality.


To determine the number of males required to achieve proportionality, multiply the number of participating females by the ratio of the male enrollment share to the female enrollment share: n males for proportionality = (50 × .476) / .524 ≈ 45.5. Reversing the shares gives the second equation: n females for proportionality = (50 × .524) / .476 ≈ 55.

Next, we compare these targets with the number of students in each subgroup who are actually participating. For males, this is 50 - 45.5 = 4.5; for females, 50 - 55 = -5. That -5? It means that we need five more females enrolled in Algebra to achieve proportionality.
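If you'd rather let a computer do the arithmetic, here is a minimal Python sketch of the calculation above. The function name and structure are my own illustration, not part of any school data system:

```python
def proportionality_gap(enrolled_a, enrolled_b, participating_a, participating_b):
    """How far group A's participation is from proportional.

    Positive result = group A is overrepresented by that many students;
    negative result = that many more group A students are needed.
    """
    # Group A's share of total enrollment (males above: 250 / 525 = .476)
    share_a = enrolled_a / (enrolled_a + enrolled_b)
    # Participation that would be proportional, given group B's participation
    needed_a = participating_b * share_a / (1 - share_a)
    return participating_a - needed_a

# The Algebra example: 250 males, 275 females, 50 of each enrolled
print(round(proportionality_gap(250, 275, 50, 50), 1))  # 4.5 (males overrepresented)
print(round(proportionality_gap(275, 250, 50, 50), 1))  # -5.0 (five more females needed)
```

The same formula works for any pair of subgroups and any program: swap in suspension counts, basketball turnout, or Calculus enrollment.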

While it may not be entirely realistic to achieve perfect proportionality within a system for all programs, subgroups, and outcomes, it is still important to review these data to reflect on areas where institutionalized racism or policies may be contributing to disproportionality. Another factor to consider is the size of the subgroups that you are reviewing. For example, if you only have two or three American Indian students in a grade level, it's unreasonable to expect that they are represented in everything---but you should look to be sure that they are represented somewhere among school offerings. In that case, it may be more helpful to use longitudinal data to get a picture of participation over time.


Here's an Excel workbook that allows you to easily compare gender equity in sports programs. I built it a couple of years ago for a program that needed it, based on an idea from Debra Dalgleish. See her site for even more ideas on data entry forms...and feel free to modify mine to suit your needs.

Monday, April 13, 2015

Session Recap: Data Displays as Powerful Feedback

I had the pleasure of presenting at the ASCD annual conference last month. Each year, I stretch myself a little further in making connections between ideas, as well as between technology and content.

My session description: Developing visual literacy is a key skill for student success with Common Core State Standards. Students also need clear feedback about their progress. Using data displays, such as charts and graphs, we can integrate these goals and increase student achievement. In this interactive session, you will learn strategies that increase visual literacy and foster communication. You will also learn to effectively use data collected in classrooms as feedback with students. Both digital and analog tools for organizing and integrating data into lessons will be provided.

Session descriptions are written at least 10 months before the actual presentation happens. This extended timeline can explain why many sessions are not as promised, which is very frustrating for attendees. You pick something from the catalog that looks like it is the most perfect thing ever, only to show up and discover that the presenter has something different in mind. I try to stick as closely as possible to my submitted description, but I admit that I end up taking a little birdwalk here and there. It's hard not to---you learn so much between submitting a proposal and actually presenting it. For me, a lot of that growth came from changing jobs this year and getting a much better on-the-ground view of the lack of visual literacy among students and teachers.

A logic model framed my presentation, connecting visual literacy, data displays used as feedback, and student achievement.


I started the presentation with a brief look at visual communication in general---pictures have been used far longer than text. Then, we talked about how graphics used as feedback have a larger impact on student achievement than nearly any other type of feedback (such as simply marking answers right/wrong). All of this was to build a case for becoming visually literate.

I won't bore you with all the details. I fused together some previous presentation materials and pulled a lot of pieces of this blog in as examples. But if you want a look at things, I have it all stashed on the same wiki as my other resources.

I was slated for 8 a.m. on the last day of the conference---not quite the worst possible time slot, but just about. So, I had a small but awesome crowd. Lots of comments afterward made me feel good, from one gentleman who said it was the best hour of the entire conference (and asked if there would be a Part II) to another with a very heady offer I'm still considering.


Proposals for next year are due in a month, so I am already kicking around things to share. I think that I will put in something about using questions to focus data use...and something similar to this year on visual literacy skills. We have to expand our conversation about visual literacy. We work so hard in schools to be literate in other ways. We practice rules for grammar, punctuation, and different forms of writing...all with the goal of improving communication. But for the most part, the visuals developed are junk. And that needs to change.

In the meantime, there is SO much I want to learn. I would love to go to the Eyeo Festival next year. Or somehow wrangle an invitation to the Tapestry Conference. I'm feeling a need to get beyond the borders of education and exchange ideas and resources. I continue to do lots of reading and thinking (no matter how quiet I am here) and am always pondering what to learn next. Isn't that what we want for our schools, too?

Sunday, March 22, 2015

ASCD 2015: Data Tools

This is the third year that I've been on the hunt for high-quality data tools in the Exhibit Hall at the ASCD annual conference. The first year (2013) was downright depressing. Last year was better---I ran across a couple of promising tools, although neither is represented at the conference this year. Here are the trends I'm seeing this spring.

Data Capture
This is a brand new theme this year. I saw three different tools yesterday that are meant to support teachers in recording student conversations or other "in the moment" data points and then associating those with a gradebook or spreadsheet. I am intrigued by these, though I think their benefit may be somewhat limited right now: teachers would need to be outfitted with tablet devices and know how to seamlessly integrate those with their classroom work. I suspect that more and more teachers fit that description each year, but my school district is not quite there yet. One thing that I really like about these tools, however, is that they put the power of assessment back into the hands of teachers. In an age where large-scale district and state assessments carry the weight and propel the discussions, these tools give teachers another way to show student learning. Yes, these demonstrations were always there, but now there are supports for teachers to share the very important daily learning with others. The best tool of the bunch? Just open Sesame.

Item Development
With the advent of new online assessments in many states, such as Smarter Balanced and PARCC, a new crop of tools is emerging that allows classroom teachers to build items and assessments with many of the features of their large-scale brethren. SchoolCity was the first one I saw last year, but there are a couple of new players showing their wares at this year's conference. The best of these is Edulastic. Most of these tools promise integration with your gradebook or data warehouse. We have to be cautious, however: just because you can make all sorts of new-fangled items for kids to answer doesn't mean you should. If your goal is just to have kids practice responding to particular sorts of items (e.g., drag-and-drop) for "The Test," then I hope you'll think a bit harder before purchasing this kind of software. We also need to support teachers with the basic assessment literacy required to write good items to measure student learning. We haven't done that much with paper/pencil tests---and online forms mean we have even more background knowledge to build.

Design Is Better
Most of the tools I saw yesterday show some thoughtful design. As a whole, they're far better than they were two years ago, but there is still a long way to go. I didn't find a single vendor who uses a data designer for their displays---they all depend on developers to code whatever charts and visualizations they have. Some claim that their charts are "designed by teachers." This is also a bad answer. Teachers and other educators should definitely have input---they are the end users for the tool---but they are not data designers. Listen to the stories classroom experts need to have told, then create the interface to communicate them in a powerful way.

One vendor had the biggest, ugliest, exploded 3D pie chart on their screen. I asked them why they had chosen it. The rep wasn't sure. I probed further: Why is it in 3D when you only have two dimensions of data? His reply: Because it looks cool. No, honey...it doesn't.

We have to demand better from our vendors.


If you're in Houston in the next day or so, swing by the conference and check out all the new data tools for the classroom. Or, if you've already wandered through the vendor area, feel free to leave your new favourite option in the comments.

Saturday, February 28, 2015

Neither Smart nor Balanced

I'm on a crusade against bad data viz in education. For an industry so consumed with being "data-driven," we don't display our data very respectfully. The vendors out there don't help. Instead of hiring data designers to develop visualizations that support educators in getting the most information from their data, we get things like this:

from http://www.smarterbalanced.org/news/smarter-balanced-states-approve-achievement-level-recommendations/


Can we talk about this chart from Smarter Balanced?

Here's a brief list of things that make this chart difficult to use:
  • The title is centered. In the Western world, we start reading at the top left. When you change that for a chart (or slide or document), the audience has to spend time and cognitive processing to figure out how things are organized...leaving less "head space" to interpret the information.
  • The y-axis label needs to be turned 90 degrees so we can read it easily.
  • There is way too much ink here in relationship to the data. We don't need tickmarks on the outside of the chart. The "%" signs in the stacked bars are redundant---we've already said they're percentages in the title. Axis titles don't need to be bold. We also don't need the y-axis labeled in increments of 10.
  • The colors are awful. Not only do we have the ubiquitous red-yellow-green, which is not very accessible for all vision types, the shades selected here are particularly gaudy. Are we clowns? Do we amuse you?
  • Finally, humans do best at judging length and position (out of all the pre-attentive attributes). This chart takes advantage of neither.

So, let's fix those things.
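I rebuilt the chart in Excel (the workbook is linked at the end of this post), but for the code-inclined, here's a rough matplotlib sketch of those same fixes. The percentages below are placeholders for illustration, not the actual Smarter Balanced figures, and turning the columns on their side also takes care of the rotated axis label:

```python
import matplotlib.pyplot as plt
import numpy as np

# Placeholder data, NOT the real Smarter Balanced results
grades = ["Grade 3", "Grade 4", "Grade 5"]
levels = ["Level 1", "Level 2", "Level 3", "Level 4"]
pcts = np.array([[30, 25, 28, 17],   # one row per grade
                 [28, 26, 29, 17],
                 [26, 27, 30, 17]])
colors = ["#d1e5f0", "#92c5de", "#4393c3", "#2166ac"]  # sequential, not red-yellow-green

fig, ax = plt.subplots()
left = np.zeros(len(grades))
for label, color, col in zip(levels, colors, pcts.T):
    ax.barh(grades, col, left=left, color=color, label=label)
    left = left + col

ax.set_title("Percentage of students at each achievement level", loc="left")  # read from top left
ax.set_xlabel("Percent of students")  # horizontal label, no neck-craning
ax.tick_params(length=0)              # drop the tickmarks
for spine in ax.spines.values():      # less non-data ink
    spine.set_visible(False)
ax.legend(ncol=4, frameon=False, loc="upper center", bbox_to_anchor=(0.5, -0.12))
plt.tight_layout()
plt.show()
```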

Okay, we're getting there...but we still have a problem. It's very difficult to make any comparisons among the groups represented. This is an issue common to stacked bars---in either vertical or horizontal form.

What if we line up the bars at the division between not meeting and meeting standard?


When we "float" the bars on the chart, we not only have all the information we need, we can also more easily dig a little deeper for comparisons between grade levels. (If you like this sort of chart, there's a tutorial on Ann K. Emery's blog.) I could probably improve it a bit more by playing with the x-axis, but this is still a vast improvement over the original...if I do say so myself.
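The trick, in code form: offset each bar's starting point by its below-standard total so the met/not-met boundary lands at zero. Again, a sketch with placeholder numbers rather than the real results:

```python
import matplotlib.pyplot as plt
import numpy as np

# Placeholder percentages: Levels 1-2 below standard, Levels 3-4 meeting it
grades = ["Grade 3", "Grade 4", "Grade 5"]
below = np.array([[30, 25], [28, 26], [26, 27]])  # Level 1, Level 2 per grade
meets = np.array([[28, 17], [29, 17], [30, 17]])  # Level 3, Level 4 per grade

fig, ax = plt.subplots()
# Below-standard segments grow leftward from zero...
left = -below.sum(axis=1)
for col, color, label in zip(below.T, ["#ca0020", "#f4a582"], ["Level 1", "Level 2"]):
    ax.barh(grades, col, left=left, color=color, label=label)
    left = left + col
# ...and meeting-standard segments grow rightward from zero
left = np.zeros(len(grades))
for col, color, label in zip(meets.T, ["#92c5de", "#0571b0"], ["Level 3", "Level 4"]):
    ax.barh(grades, col, left=left, color=color, label=label)
    left = left + col

ax.axvline(0, color="black", linewidth=1)  # the not-met / met boundary
ax.set_title("Percentage of students at each achievement level", loc="left")
ax.set_xlabel("Percent of students (below standard | meeting standard)")
ax.legend(ncol=4, frameon=False, loc="upper center", bbox_to_anchor=(0.5, -0.12))
plt.tight_layout()
plt.show()
```

Because every grade's bar is aligned at that zero line, the "percent meeting standard" comparison across grades is a simple position judgment, which is exactly what our eyes are good at.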

Here is a before and after comparison---same data, different presentations:



So, Smarter Balanced, what do you say? Can you start making an effort for us and get some meaningful data visualizations to go with the new system?

You can also download my workbook for this chart.

Friday, March 28, 2014

When Good Data Go Bad

Not that long ago, we talked a bit about data quality---attributes of your data that describe its "truth." A recent post over at Education by the Numbers caught my eye, because it speaks to the possible effects of poor data quality. The post pulls out the following quote (emphasis added by me):

The federal report found that barely half of Georgia’s high schools offered geometry; just 66 percent offered Algebra I.

Those data are just plain wrong, said Matt Cardoza, a spokesman for the Department of Education. The state requires Algebra I, geometry and Algebra II for graduation, so all high schools have to offer the content — but they typically integrate the material into courses titled Math 1, 2, 3 and 4, Cardoza said. He surmised that some districts checked “no” on the survey because their course titles didn’t match the federal labels, even if the content did.

“It’s the name issue,” Cardoza said. “I think schools just didn’t know what to say.”

Ah, data quality has reared its ugly head...and now everyone is freaking out about the perceived inequity of math offerings.

So, here's the deal. Suppose you're working at a high school. You offer an algebra class...but you might not call it Algebra I. You might call it just plain Algebra. Or Algebraic Thinking. Heck, you might even have Advanced Algebra or Honors Algebra or 9th Grade Algebra. At the school level, this distinction doesn't really matter. The school builds a master schedule and assigns highly qualified teachers to whatever sections it identifies as math. When a new student shows up and needs a math credit, everything in the student information system enables the placement.

But that isn't the end of the story. There's another layer of data that few---maybe just the registrar or district data manager---will ever see. There's a whole taxonomy of course codes determined by the National Center for Education Statistics. These course codes are collected by the state and are part of the district student information system. But because the district doesn't use them for anything---remember, they have their own labels---not many pay attention to what fills those fields.

Here is one example:


These are the math classes at Bellevue High School for the 2012-13 school year (data source here). Columns 4-6 contain the state course labels---the invisible ones---and columns 7 and 8 are designated for the district. So, let's dig into the last row ("Mathematics-Other") and see what the district is lumping in there.


Notice that in the second column from the right---District Course Title---we have things like Alg I Seminar, Gmtry Seminar, G-Alg 1 Seminar. We can't see the syllabi for these classes, but it's likely that algebra and geometry concepts are being taught. Kids are getting math credits and are being scheduled into math classes, but a data pull at a state or federal level will never see these.

It gets worse. Start digging through "miscellaneous" categories, and you start to see things like this:


The state course code on the left says English Language and Literature-Other...and the district has assigned biology, chemistry, physics, nanotechnology, and more to this category. Even if these are courses for English language learners, special education students, or another population, it's still science content---it doesn't belong in English. At this level, data quality is a real mess.
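This kind of audit is easy to script once you can export the course file. Below is a hypothetical pandas sketch; the file name and column names are made up for illustration and will differ in any real state export:

```python
import pandas as pd

# Hypothetical export of the course catalog; real column names will differ
courses = pd.read_csv("course_catalog.csv")

# 1. Pull everything filed under a catch-all "-Other" state code
catch_alls = courses[courses["state_course_title"].str.endswith("-Other")]
print(catch_alls[["state_course_title", "district_course_title"]])

# 2. Flag science-sounding district titles filed under English state codes
science_words = "biology|chemistry|physics|nanotech"
mismatches = courses[
    courses["state_course_title"].str.startswith("English")
    & courses["district_course_title"].str.contains(science_words, case=False)
]
print(mismatches[["state_course_title", "district_course_title"]])
```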

But what to do? After all, it doesn't make a difference to the district; they have their own codes and credit systems. It does make a difference, though, to anyone outside of that system. It's public data. Anyone can use it for any reason, from bureaucrats trying to make decisions about allocations to think tanks sounding the alarm about equity.

All of this mumbo jumbo doesn't mean that schools with larger minority populations aren't being underserved. Considering the other ways we shortchange these students, it wouldn't be surprising if access to a rigorous curriculum were on the list. I suspect, however, that noise from poor data quality is hiding the true signal.

I tell teachers all the time that paying attention to data quality is the simplest way to have a direct effect on policy. You might not think that the attendance you took in first period matters...but it does. As it rolls up, districts will make decisions about how they allocate resources...states will consider policy. How many absences before a student should be considered "at-risk"? What strategies work best to improve attendance rates? What should be the legal consequences for students or parents when kids don't attend school? All the little pieces of data matter. If we want better policy, we need better data quality.

Monday, March 24, 2014

Lookin' for Data Love

I recently attended (and presented) at the 69th annual ASCD conference. If you're unfamiliar with the acronym, ASCD is one of the largest and oldest professional organizations for educators. The letters used to stand for the Association for Supervision and Curriculum Development, but that definition was scuttled a few years ago when the organization outgrew its original boundaries. The acronym was kept for branding purposes.

You might remember that I went hunting last year for data tools in the exhibit hall. I did so again, along with attending a couple of presentations on how data is being used in schools. So, here's the wrap-up.

Vendors
I looked at three different tools in the exhibit hall. None were quite to the level of a student information system, but all integrated assessment and performance data. Two are not worthy of further discussion (one, in fact, admitted that they do no testing/accommodations for accessibility).

A third, Schoolzilla, didn't totally blow me away; however, they are using Tableau to build their reports. The reports follow the Shneiderman mantra of "Overview first...then zoom and filter...details on demand." To be fair, I don't expect any product to knock my socks off in an exhibit hall setting where I spend <5 minutes at a booth. However, Schoolzilla may be worth a more in-depth look if your district is on the hunt for that sort of thing.

I also spent a chunk of time at another venue chatting with a rep from SchoolCity. Their product is currently undergoing a complete redesign, but I got a behind-the-scenes peek at things. I suspect that this product may well be worth a second look in the coming months.

Presentations
Again, my socks remained firmly on my legs. Okay, so they were imaginary socks---the conference was in Los Angeles and it was too warm to wear such things---but let's just go with the metaphor here. The presentations I attended were focused on sharing how a particular school or program was using data. The common thread among all these was that no one starts with a question---and I find this worrisome.

Tweet by Science_Goddess: Data use that doesn't start with a question worries me.
https://twitter.com/science_goddess/status/444960747974455296


While it's good practice for student assessment to guide teachers' next instructional steps, it is impossible to use every single piece of data generated in the classroom. We have to focus---we need to be picky about where we dig. I know it isn't as simple as that. The hardest part of any analysis is that very first step: asking a good question---the one most worth asking.

Switching it up for final session for today. "Grab a shovel: Data digs to drive student achievement." (Don't shovel and drive, kids.)
https://twitter.com/science_goddess/status/444955206690684928

I was pleased to hear one presenter talk about how too many teachers see the purpose of data as sorting and selecting. I became worried, however, when she mentioned how "all the data is spewed across the wall in the War Room." My colleague often says that we have to get beyond admiring the problem. Data can be used in strategic ways, to be sure, but that means respecting what we collect and being thoughtful about how we move forward with it. There is something troublesome, for me, in any terminology that involves spewing and war.

As for me, my presentation went well enough---I even ran short, although no one complained about getting away early. :)  The next morning, this happened (well, after the earthquake):

Tweet by @science_goddess: Just got asked "Aren't you the data woman?" Why, yes...yes, I am.
https://twitter.com/science_goddess/status/445559513274273792


This data woman hopes to see you at the 70th annual ASCD conference in Houston next year!

Friday, March 14, 2014

Data Sharing for Good

On Sunday, I'm presenting a session at the ASCD annual conference on using data in the classroom. Along the way, we'll take a tour of chart selection for a data set, some best practices for data viz, and tools for moving beyond "admiring the data." Materials for the session, along with other sundry data viz resources, are here.


I've tried to pull from a variety of sources, with a few favourite quotes sprinkled in.

We'll play a little game with the When Excel Gives You Lemons data. You can play, too, by clicking here to choose a bachelor from the three below. (Turn your speakers up for the full effect.)

We'll explore a few different kinds of stories...and the problem with pie charts.


A detour to Sesame Street will help us think about pre-attentive attributes.





So, if you're in Los Angeles this weekend, stop by and learn with us. If you have ideas and resources to share, leave them in the comments.