

Data Looms Large in Quest for New School-Quality Indicator

July 27, 2016

Source: edweek.org

By: Daarel Burnette II

States look hard at what’s required to meet ESSA’s mandate

States scrambling to come up with more nuanced ways to measure school quality under the new federal K-12 law are running smack into an old problem: how to make sure they have the right data.

The Every Student Succeeds Act requires that states—in addition to using English-language proficiency, graduation rates, and scores on statewide achievement tests—add at least one new indicator of school quality or student success, such as school climate, chronic absenteeism, discipline, or college and career readiness.

For many states, adding that new indicator may mean spending more on data systems and collection. It may also mean avoiding approaches that would demand too heavy a data lift, or picking something off the shelf rather than crafting a more challenging indicator, because the needed information isn’t easily available.

Complicating the matter, the law requires that the data for the new school-quality indicator must be valid, reliable, and comparable across districts, and that officials be able to break out the information by student demographics.

That presents a challenge for state education agencies that want to pick indicators built on classroom observations or teacher and parent surveys to measure schoolwide qualities, such as whether parents feel engaged or whether teachers are participating in effective peer-mentoring programs.
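As a rough illustration of what the comparability and disaggregation requirements demand of a data system, here is a minimal sketch in Python. The column names, subgroups, and figures are invented for illustration; they are not any state’s actual schema or data.

```python
# A minimal sketch of ESSA's "break out by student demographics"
# requirement. All names and numbers here are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "district":  ["A", "A", "B", "B"],
    "subgroup":  ["econ_disadvantaged", "all_students",
                  "econ_disadvantaged", "all_students"],
    "chronically_absent": [12, 40, 9, 55],
    "enrolled":           [60, 400, 45, 500],
})

# Comparability across districts depends on every district using the
# same numerator (chronically absent students) and denominator
# (enrolled students) definitions.
records["rate"] = records["chronically_absent"] / records["enrolled"]
print(records.set_index(["district", "subgroup"])["rate"])
```

The sticking point the article describes is the last comment: the arithmetic is trivial, but only if every district feeds in numbers built from the same definitions.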

“Here’s a great opportunity for departments to innovate, and they’re being placed right back in a box,” said Mark Elgart, the president and chief executive officer of AdvancEd, a group that’s consulted with education departments to help them create new accountability systems.

But many consultants working with state departments are advising that they not let data-collection issues impede innovation.

“If something’s not feasible to collect, you have to treat it as an implementation issue,” said Joanne Weiss, who was a chief of staff to former U.S. Secretary of Education Arne Duncan and who currently consults with state education departments. “That doesn’t mean it’s not an important indicator that shouldn’t be included in the system.”

Tough Choices

The U.S. Department of Education in June issued its proposed regulations for states putting together new accountability systems under ESSA, which is due to go fully into effect in fall 2017. The draft repeats the law’s requirements for four mandated academic indicators, as well as for the new “fifth indicator” of school quality or student success.


In the meantime, many states are already wrestling with whether to pick a school quality indicator that is ideal and ambitious or one that is practical and safe, with data collection and analysis a major factor:

• California’s education department has pushed back against aggressive efforts by parents and advocates to measure school climate, an indicator that officials say they don’t yet have enough reliable information to measure.

• Connecticut’s education department rejected proposals to add civic engagement to that state’s accountability system—an indicator that would require collecting new data.

• And South Carolina officials, not wanting to trample on the imagination of the state’s accountability task force, will spend more than $1 million to measure school and career readiness as part of the state’s new accountability system. “We know we’re going to be collecting significantly more data with this new system,” said Sheila Quinn, South Carolina’s deputy schools superintendent.

One big issue: whether states and districts are able to retrofit their data-collection systems to answer new and increasingly difficult questions, a potentially arduous and expensive task.

For many measures, state officials say they lack the infrastructure to collect information reliable enough to attach high stakes to. Many districts’ data-collection systems are scattershot and outdated. Scores of technicians responsible for processing data have been laid off in recent years amid budget cuts. And local superintendents have complained that states already require them to collect an inordinate amount of data.

The details are daunting. Scott Norton, the Council of Chief State School Officers’ strategic-initiative director for standards, assessment, and accountability, said pulling all the right data together requires syncing districts’ systems, then coding those systems to collect the right information.

Some data points, such as whether a student is a foster child or part of a military family, are pretty straightforward. But others—such as how students feel about a school’s climate or whether teachers are receiving a certain amount of professional development—may require a bevy of surveys that then must be manually entered into the database.

As a result, many education departments, depending on their capacity, will consider outsourcing the work or paying millions of dollars to purchase entire new systems, consultants say.
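To make Norton’s point about syncing concrete, here is a hedged sketch of the kind of field mapping involved. The district names, field names, and mappings below are all invented for illustration, not drawn from any real state system.

```python
# Hypothetical sketch of "syncing districts' systems": two districts
# report the same facts under different field names, and the state
# maps both onto one common schema before any comparison is possible.
COMMON_SCHEMA = ["student_id", "days_enrolled", "days_absent",
                 "military_family"]

FIELD_MAPS = {
    "district_a": {"sid": "student_id", "enroll_days": "days_enrolled",
                   "abs_days": "days_absent", "mil_fam": "military_family"},
    "district_b": {"StudentID": "student_id", "Membership": "days_enrolled",
                   "Absences": "days_absent",
                   "MilitaryConnected": "military_family"},
}

def normalize(record: dict, district: str) -> dict:
    """Rename one district's fields to the state's common schema."""
    field_map = FIELD_MAPS[district]
    renamed = {field_map[k]: v for k, v in record.items()
               if k in field_map}
    # Survey-based fields have no automated source and often arrive
    # later, keyed in by hand, so missing values are the default.
    return {field: renamed.get(field) for field in COMMON_SCHEMA}

print(normalize({"sid": 101, "enroll_days": 180, "abs_days": 12,
                 "mil_fam": False}, "district_a"))
print(normalize({"StudentID": 202, "Membership": 175, "Absences": 3,
                 "MilitaryConnected": True}, "district_b"))
```

Even in this toy version, the mapping table has to be written and maintained once per district, which is one reason the work gets outsourced or folded into the purchase of an entirely new system.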

Student-Level Information

There’s also the sheer volume of information. School districts today collect hundreds of thousands of data points about children that are often stored in large data warehouses. Students track their academic progress in data binders, teachers tweak their curriculum based on rapid-fire online quizzes, and principals tally office referrals to craft new discipline procedures.

Against that backdrop, Brennan McMahon Parton, the Data Quality Campaign’s associate director for state policy and advocacy, has traversed the country in recent months urging state education departments and lawmakers to evaluate the data they already collect before deciding to collect more as they weigh new school quality indicators.

“Many states have meaningful and useful data in their system already,” she said. “That’s not to say with a push of a button, you get what you need.”

In Connecticut, more than two-thirds of local superintendents said in a 2012 survey that the amount of data the state requires them to collect was duplicative, burdensome, and costly. That year, Democratic Gov. Dannel Malloy signed an education bill that tasked the state department with cutting by a third the number of data forms districts fill out each year.

So when the education department formed a task force two years ago to construct a new accountability system, superintendents and the agency pledged that any new indicators would have to be based on information the department already collected.

“Oftentimes, when the state asks for new data, we tell them we already have it or we’ve been giving it to you in other ways,” said Joseph J. Cirasuolo, the executive director of Connecticut’s superintendents association. “Usually, it’s not where it has to be.”

In the end, Connecticut decided to add to its accountability system: access to arts courses; chronic absenteeism; career readiness, based on students’ performance on the state achievement test, the SAT, the ACT, Advanced Placement exams, or International Baccalaureate exams; schools’ college-entrance rates; and three new ways to measure graduation rates.

Big Price Tag

In South Carolina, the task force designated to come up with a new accountability system decided to collect information on elementary, middle, and high school readiness and career readiness. That state’s districts all collect data using separate systems, many with different contractors. Definitions of indicators vary widely, such as what counts as chronic absenteeism or what qualifies as a suspension.

In order for South Carolina to measure the new indicators, the department will spend more than $1 million to buy a new collection system that pulls data points from each district’s systems.

“We collect attendance, but the question is: What is the quality of the attendance data that we receive?” asked Daniel Ralyea, the director of the state education department’s office of research and data analysis. “I can aggregate it at the state level, but what happens is, in practice, elementary schools may not be as concerned with recording attendance as high schools are.”
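Ralyea’s worry can be sketched in a few lines of Python. The definition used below (a student is chronically absent after missing 10 percent or more of enrolled days) is a common one but is an assumption here, as are the school names, records, and the crude quality flag.

```python
# Illustrative only: one statewide chronic-absenteeism definition
# (absent 10%+ of enrolled days is a common choice) applied to
# made-up school records, with a crude flag for schools whose
# recorded absences look implausibly low.
schools = [
    # (school name, list of (days_enrolled, days_absent) per student)
    ("Elementary 1", [(180, 1), (175, 2), (180, 0)]),
    ("High School 1", [(180, 25), (178, 16), (180, 5)]),
]

for name, students in schools:
    chronic = sum(1 for enrolled, absent in students
                  if absent / enrolled >= 0.10)
    rate = chronic / len(students)
    # If a school records almost no absences at all, the data rather
    # than the attendance may be the problem, echoing Ralyea's point
    # about elementary schools recording attendance less carefully.
    avg_absent = sum(absent for _, absent in students) / len(students)
    flag = "  [check data quality]" if avg_absent < 3 else ""
    print(f"{name}: chronic-absenteeism rate {rate:.0%}{flag}")
```

The state-level aggregate is easy to compute; whether it means anything depends on how faithfully each school recorded absences in the first place.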

And in California, the debate over whether to use school climate as an indicator involves such factors as classroom observations and a host of student, parent, and teacher surveys.

“We think [those surveys] are used best at the local level,” said Keric Ashley, the deputy superintendent of California’s education department, pointing out that the data are prone to errors.