May 17th, 2021 • Erika Tyagi, Poornima Rajeshwar, and Liz DeWolf
UPDATE: Data Reporting & Quality Scorecard, Round 3
In March, we released our Data Reporting & Quality Scorecards assessing the transparency of each of the 53 state and federal carceral agencies we track. We graded each system on the breadth of COVID-19 data it reports (our data reporting metrics) and on how effectively it reports those data (our data quality metrics).
On April 12th, we reassessed the scores and grades for each agency to account for changes in data reporting practices and to update our own criteria. A third round of assessments took place on May 10th.
In this round, 83% of agencies failed. This is up from 81% in Round 2, and 75% in the initial round. Click here for the raw scorecard data for all 53 agencies.
As in Round 2, the grade changes were a result not of changes in data reporting practices, but of our decision to add two new scorecard metrics this month, which added four points to the total possible score (now 32 points) for each agency. The two new metrics assessed whether agencies reported active cases for staff and total staff population. It has become particularly critical for agencies to report these data points as we continue to learn of low vaccine uptake among correctional staff in several states. As large numbers of staff decline vaccination, it is essential to know the current state of the virus among this population, which can serve as a vector for transmitting the virus to incarcerated people; these two variables make that possible.
Key changes:
In the third round of scoring, two agencies’ grades improved: the California Department of Corrections and Rehabilitation, which reports both active staff cases and staff population, improved from a C to a B, and the Pennsylvania Department of Corrections, which recently reinstated its COVID-19 dashboard after taking it offline in January, improved from an F to a C. Grades for the state correctional agencies in Minnesota and Wisconsin dropped from Cs to Ds because neither agency reports active staff cases or staff population. The grades for the agencies in North Carolina and Oregon dropped from Ds to Fs for the same reason.
The agencies in Florida and Mississippi each lost points for decreasing the frequency with which they update their dashboards. The former has dropped to biweekly updates, while the latter has not updated its dashboard since April 23. The Ohio Department of Rehabilitation and Correction lost points on the “cumulative cases” metric because we recently learned that the agency removes cases from its overall case count whenever a person is released. The loss of points did not affect the letter grades for any of these agencies because all three had already received Fs.
We once again note that the scores we have assigned to the agencies do not reflect our judgment on how each has managed and responded to COVID-19 inside prisons, nor how accurate we believe the reported data to be. Under-testing may create gaps between the data as reported and the reality on the ground. The grades we assign reflect only our judgments on how comprehensive the reported data are and how well they are presented.
Scorecard for the California Department of Corrections and Rehabilitation
About the Metrics
Data Reporting
Our metrics for data reporting are tied to the 12 key variables we aim to collect from each jurisdiction. Out of these, six relate to incarcerated people and six to correctional staff. We have previously outlined why, at a minimum, all correctional agencies should report COVID-19 cases, deaths, and tests for incarcerated people and staff, and also why the reporting of real-time facility-level population data is essential. Knowing how many people are incarcerated and work at each facility puts the numbers of total cases, deaths, and tests in context. Finally, knowing the number of people who have been vaccinated is critical to understanding how the pandemic is being managed behind bars. No agency, with the exception of the West Virginia Division of Corrections and Rehabilitation, reports all 12 variables, and even this agency fails to report some variables at the facility level.
To assign scores for data reporting, we first assessed whether an agency reports each variable at all, and then whether it reports the variable in statewide aggregates or at the facility level. The scores allocated to these variables ranged from 0-2: 0 points if the variable is not reported, 1 point if the agency only reports statewide aggregates, and 2 points if the agency provides facility-level data for that variable.
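As a rough illustration of this rubric (not the scoring code we actually use), the point assignment for a single reporting variable can be written as a short function; the inputs below are hypothetical.

```python
def score_reporting_variable(reported: bool, facility_level: bool) -> int:
    """Score one reporting variable on the 0-2 scale described above."""
    if not reported:
        return 0          # variable not reported at all
    if facility_level:
        return 2          # reported for each facility
    return 1              # reported only as a statewide aggregate


# Example: facility-level cumulative cases earn 2 points,
# a statewide-only staff case total earns 1 point.
score_reporting_variable(reported=True, facility_level=True)   # -> 2
score_reporting_variable(reported=True, facility_level=False)  # -> 1
```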
Data Reporting Metrics:
- Cumulative cases: The agency reports the number of incarcerated people/staff who have ever tested positive for COVID-19. (Note: Some agencies remove cases from their total number of cases when people are released, which means their numbers are not true cumulative case counts. These agencies do not receive points for the metric, but we still report these data because they are the only data available.)
- Cumulative deaths: The agency reports the total number of incarcerated people/staff who have died with or from COVID-19. (Note: Some agencies do not include people who were positive for COVID-19 but were found to have died of another cause. We believe that agencies should include all people who had COVID-19 at the time of their death and note whether infection was indicated as the direct cause of their death.)
- Cumulative tests: The agency reports the total number of tests performed on incarcerated individuals and/or the total number performed on staff throughout the pandemic. (Note: While a few agencies report the number of people tested, agencies only receive points for reporting the number of tests administered. Monitoring for COVID-19 requires regular testing and reporting only the number of people tested obscures the regularity of testing.)
- Active cases: The agency reports the total number of incarcerated individuals/staff who have an active COVID-19 infection and have not been deemed recovered.
- Population: The agency reports the total number of incarcerated individuals/staff within a particular facility. (Note: As with all other metrics, agencies only receive points for including total population on their COVID-19 dashboards, not for reporting population elsewhere on their website.)
- Vaccinations: The agency reports the number of incarcerated people/staff who have received at least one dose of the vaccine, the number who have completed their vaccination schedule, and/or the number of vaccine doses administered. (Note: While we assigned points for multiple types of vaccination variables, we urge agencies to report the number of people who have received doses, rather than the number of doses administered. Given that vaccines vary in the number of doses required, it is not possible to discern how many people have been vaccinated when agencies only report the number of doses or do not specify the definition of their vaccination variables. Agencies that do not clearly define which vaccine variable they are reporting lost points for the “clearly defined” metric.)
Data Quality
The data quality section of the scorecard consists of four metrics related to the manner in which agencies report the 12 variables mentioned above. We assessed agencies on whether their data are presented in a format that can be easily read by computer software, whether they report data on a regular basis (i.e., at least weekly), whether they clearly define the variables they report, and whether they display any historical data for at least one of these variables. Although machine readability may only matter to a particular set of data users, it is a critical feature of functional dashboards because it enables researchers to collect and compare data efficiently.
Each data quality metric was assessed on a binary scale: 2 points were awarded for ‘Yes’ and 0 points for ‘No’. We awarded 2 points rather than 1 so that each data quality metric carries the same weight as each data reporting metric.
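Putting the two sections together, the arithmetic behind an agency’s raw score looks roughly like the sketch below; the inputs are made up for illustration and are not drawn from any agency’s actual scorecard.

```python
def total_score(reporting_scores: list[int], quality_flags: list[bool]) -> int:
    """Combine 12 reporting scores (0-2 each) with 4 quality metrics (0 or 2 points each)."""
    assert len(reporting_scores) == 12 and len(quality_flags) == 4
    return sum(reporting_scores) + 2 * sum(quality_flags)


# A hypothetical agency that reports 10 variables at the facility level,
# omits 2 variables, and meets 3 of the 4 quality metrics earns 20 + 6 = 26 points.
total_score([2] * 10 + [0] * 2, [True, True, True, False])  # -> 26
```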
Data Quality Metrics
- Machine readable: Data are presented via an API or in JSON, CSV, or XML formats (see the short sketch after this list). Static images, PDFs, and HTML pages are not considered machine readable.
- Regularly updated: Data are updated at least once per week, with a visible timestamp.
- Clearly defined: Variable definitions are visible on the agency website (e.g., in a data dictionary or table footnotes).
- Contextualized historically: Historical data for at least one of the key variables are displayed on the agency website.
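To see why machine readability matters in practice: a facility-level CSV export can be pulled straight into analysis software, whereas a static image or PDF table must be transcribed by hand. The snippet below is a generic sketch; the URL and column names are placeholders, not a real agency endpoint.

```python
import pandas as pd

# Placeholder URL and column names for a hypothetical machine-readable dashboard export.
CSV_URL = "https://example-doc.gov/covid19/facility_data.csv"

df = pd.read_csv(CSV_URL, parse_dates=["report_date"])

# With structured data, facility-level comparisons take one line:
# total active incarcerated cases by facility on the latest report date.
latest = df[df["report_date"] == df["report_date"].max()]
print(latest.groupby("facility_name")["active_cases_incarcerated"].sum())
```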
There are several nuanced issues with data quality that were not captured by the above metrics. For example, we have observed unexplained fluctuations in the total number of COVID-19 tests and deaths reported by the Pennsylvania Department of Corrections. In response to inquiries about the inconsistencies, the agency took its dashboard offline in late January to make adjustments and only recently reinstated it. In the intervening months, the PA DOC lost points for the data it was missing as of the scoring date, but we did not alter grades for changes in reporting.
A different but related issue exists with the data reported by the correctional departments in Florida, Arkansas, and Wyoming. Over the course of the pandemic, the agencies have gradually reduced the granularity of data included on their dashboards, becoming less transparent over time. The DOCs only earn points for the variables they report at the time we assess them, regardless of what they have reported in the past.
Where such issues raise specific concerns about data transparency, we have noted and briefly explained each one on the relevant state’s page. While not comprehensive, these notes provide important context about agencies’ reporting practices.
Assigning Letter Grades
We assigned standard letter grades to each agency based on the percentage of points earned out of a maximum total of 32. The letter grades correspond to the following score ranges (a brief code sketch of this mapping follows the list):
A: 29-32
B: 26-28
C: 23-25
D: 20-22
F: 0-19
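In code, these cutoffs correspond to the familiar 90/80/70/60 percent thresholds applied to the 32-point maximum. The function below is an illustrative sketch of that mapping, not the script we use to produce the scorecard.

```python
def letter_grade(score: int, max_points: int = 32) -> str:
    """Map a raw score to a letter grade using 90/80/70/60 percent cutoffs."""
    pct = score / max_points
    if pct >= 0.90:
        return "A"  # 29-32 points
    if pct >= 0.80:
        return "B"  # 26-28 points
    if pct >= 0.70:
        return "C"  # 23-25 points
    if pct >= 0.60:
        return "D"  # 20-22 points
    return "F"      # 0-19 points


letter_grade(26)  # -> "B" (e.g., California's Round 3 score)
letter_grade(19)  # -> "F" (e.g., North Carolina's Round 3 score)
```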
We will continue to reassess scores on a monthly basis. Please let us know if you use this scorecard as a tool to advocate for better data transparency and quality in your state.
Carceral Agency Scores
Carceral Agency | Overall | Data Quality | Reporting for Incarcerated People | Reporting for Staff |
---|---|---|---|---|
BOP | F(18 / 32) | 4 / 8 | 7 / 12 | 7 / 12 |
ICE | F(10 / 32) | 2 / 8 | 8 / 12 | 0 / 12 |
Alabama | D(20 / 32) | 4 / 8 | 8 / 12 | 8 / 12 |
Alaska | F(9 / 32) | 4 / 8 | 5 / 12 | 0 / 12 |
Arizona | F(10 / 32) | 2 / 8 | 7 / 12 | 1 / 12 |
Arkansas | F(7 / 32) | 4 / 8 | 2 / 12 | 1 / 12 |
California | B(26 / 32) | 6 / 8 | 10 / 12 | 10 / 12 |
Colorado | F(16 / 32) | 4 / 8 | 9 / 12 | 3 / 12 |
Connecticut | F(9 / 32) | 2 / 8 | 5 / 12 | 2 / 12 |
Delaware | F(13 / 32) | 2 / 8 | 7 / 12 | 4 / 12 |
District of Columbia | F(10 / 32) | 6 / 8 | 2 / 12 | 2 / 12 |
Florida | F(6 / 32) | 2 / 8 | 3 / 12 | 1 / 12 |
Georgia | F(15 / 32) | 2 / 8 | 7 / 12 | 6 / 12 |
Hawaii | F(12 / 32) | 2 / 8 | 8 / 12 | 2 / 12 |
Idaho | F(14 / 32) | 4 / 8 | 6 / 12 | 4 / 12 |
Illinois | F(12 / 32) | 0 / 8 | 6 / 12 | 6 / 12 |
Indiana | D(20 / 32) | 4 / 8 | 8 / 12 | 8 / 12 |
Iowa | F(18 / 32) | 4 / 8 | 8 / 12 | 6 / 12 |
Kansas | F(18 / 32) | 2 / 8 | 8 / 12 | 8 / 12 |
Kentucky | F(14 / 32) | 2 / 8 | 6 / 12 | 6 / 12 |
Louisiana | F(16 / 32) | 2 / 8 | 8 / 12 | 6 / 12 |
Maine | F(8 / 32) | 2 / 8 | 6 / 12 | 0 / 12 |
Maryland | D(20 / 32) | 4 / 8 | 8 / 12 | 8 / 12 |
Massachusetts | F(0 / 32) | 0 / 8 | 0 / 12 | 0 / 12 |
Michigan | F(16 / 32) | 4 / 8 | 7 / 12 | 5 / 12 |
Minnesota | D(22 / 32) | 6 / 8 | 12 / 12 | 4 / 12 |
Mississippi | F(4 / 32) | 0 / 8 | 4 / 12 | 0 / 12 |
Missouri | F(10 / 32) | 0 / 8 | 6 / 12 | 4 / 12 |
Montana | F(8 / 32) | 2 / 8 | 3 / 12 | 3 / 12 |
Nebraska | F(7 / 32) | 2 / 8 | 5 / 12 | 0 / 12 |
Nevada | F(10 / 32) | 2 / 8 | 4 / 12 | 4 / 12 |
New Hampshire | F(18 / 32) | 2 / 8 | 11 / 12 | 5 / 12 |
New Jersey | F(10 / 32) | 2 / 8 | 5 / 12 | 3 / 12 |
New Mexico | F(9 / 32) | 2 / 8 | 7 / 12 | 0 / 12 |
New York | F(8 / 32) | 2 / 8 | 4 / 12 | 2 / 12 |
North Carolina | F(19 / 32) | 8 / 8 | 10 / 12 | 1 / 12 |
North Dakota | D(22 / 32) | 4 / 8 | 10 / 12 | 8 / 12 |
Ohio | F(14 / 32) | 4 / 8 | 4 / 12 | 6 / 12 |
Oklahoma | F(8 / 32) | 2 / 8 | 5 / 12 | 1 / 12 |
Oregon | F(17 / 32) | 8 / 8 | 7 / 12 | 2 / 12 |
Pennsylvania | C(24 / 32) | 6 / 8 | 12 / 12 | 6 / 12 |
Rhode Island | F(12 / 32) | 4 / 8 | 4 / 12 | 4 / 12 |
South Carolina | F(14 / 32) | 2 / 8 | 6 / 12 | 6 / 12 |
South Dakota | F(14 / 32) | 2 / 8 | 6 / 12 | 6 / 12 |
Tennessee | F(16 / 32) | 2 / 8 | 9 / 12 | 5 / 12 |
Texas | F(14 / 32) | 6 / 8 | 5 / 12 | 3 / 12 |
Utah | F(11 / 32) | 4 / 8 | 6 / 12 | 1 / 12 |
Vermont | F(13 / 32) | 4 / 8 | 5 / 12 | 4 / 12 |
Virginia | F(13 / 32) | 2 / 8 | 7 / 12 | 4 / 12 |
Washington | F(13 / 32) | 2 / 8 | 6 / 12 | 5 / 12 |
West Virginia | B(26 / 32) | 8 / 8 | 12 / 12 | 6 / 12 |
Wisconsin | D(21 / 32) | 8 / 8 | 10 / 12 | 3 / 12 |
Wyoming | F(5 / 32) | 2 / 8 | 3 / 12 | 0 / 12 |