Part 2

What criminology already knows

A curated body of work on how people actually use justice-tech — inmate tablets, parole apps, body cameras, predictive tools, e-filing, victim notification, tip lines, digital evidence, and visitation.

~1,500 words · ~75 entries · reading time scales with you

Why this part exists

Justice-tech is a research-rich field that has been almost invisible to mainstream UX research. The studies are scattered across criminology journals, foundation reports, and government evaluations, and they are rarely framed as user research even when that is exactly what they are. The bibliography that follows tries to gather the most decision-useful work in one place, annotated for someone who has to design, evaluate, or fund a piece of justice-tech this quarter.

The inclusion bar is strict. An entry has to say something specific about how a real person actually uses, struggles with, or is affected by a piece of technology in the justice system. Bias audits without human-factors content are out. Op-eds are out. Press releases dressed up as studies are out. If we cannot write a non-obvious eighty-word annotation, we cut the entry. The goal is signal density, not completeness.

One honest consequence of that rule: a few subject areas come out thinner than others. The peer-reviewed, user-facing research on tip lines, inmate tablets, victim notification, digital-evidence workflows, and jail video visitation is genuinely sparser than the research on body-worn cameras, court e-filing, or risk assessment. We treat the gap as a finding, not a failure of the bibliography. Where a UX researcher is the first person asking these questions in a given subdomain, that is worth knowing up front. The current per-area counts are flagged on the bibliography page itself, so a reader can see at a glance where the literature is strong and where it is still being built.

If we cannot write a non-obvious eighty-word annotation, we cut it.
— The inclusion rule, in one line

How the bibliography is organized

Every entry is tagged on three axes so a reader can find what is useful in under a minute. The bibliography page lets you filter on each axis; every entry carries an export block that copies a Zotero-friendly citation or a BibTeX stanza to your clipboard.

By user type

Who is the study actually about? The labels are deliberately blunt, because the design choices for an inmate using a tablet are not the design choices for an analyst using a risk-assessment tool.

By method

How was the evidence generated? Mixing methods is the rule rather than the exception, but we tag by the dominant method to keep the filters useful.

By difficulty

How approachable is the entry for someone new to the area? The three levels are a reading map, not a quality judgment.
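The three-axis model above can be sketched as a small data structure with a filter over it. This is a hypothetical illustration only — the field names, axis values, and `filter_entries` helper are assumptions for the sketch, not the bibliography page's actual schema or implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Entry:
    """One bibliography entry, tagged on the three axes described above.

    All field values here are illustrative, not the site's real vocabulary.
    """
    title: str
    user_type: str   # who the study is about, e.g. "incarcerated person", "officer"
    method: str      # the dominant method, e.g. "ethnography", "survey"
    difficulty: str  # reading-map level, e.g. "intro", "intermediate", "advanced"

def filter_entries(entries, **axes):
    """Keep entries that match every supplied axis value."""
    return [e for e in entries
            if all(getattr(e, axis) == value for axis, value in axes.items())]

corpus = [
    Entry("Tablet use in state prisons", "incarcerated person", "ethnography", "intro"),
    Entry("BWC activation patterns", "officer", "survey", "intermediate"),
]

# Filters compose: each keyword narrows the result set on one axis.
officer_studies = filter_entries(corpus, user_type="officer")
intro_reads = filter_entries(corpus, difficulty="intro")
```

Because each axis is independent, combining filters (say, `user_type` plus `difficulty`) is just passing more keywords; that is what keeps the under-a-minute lookup promise realistic.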

The nine subject areas

The bibliography is organized around nine clusters where justice-tech and UX research already overlap. Each cluster has a short brief below; the entries themselves live on the bibliography page.

Inmate tablets and in-cell technology

Tablets, kiosks, and tablet-delivered education programs are now the default digital surface for incarcerated users in most U.S. systems. The user experience question is what happens when a captive market meets a high-friction payments model — and what happens to the families on the outside who pay the bills.

Parole and probation check-in apps

Smartphone check-ins, GPS monitors, and automated reporting tools have replaced a meaningful slice of in-person supervision. The UX questions are about burden, accuracy, and the asymmetry of information between supervisor and supervisee.

Body-worn cameras: the officer side

Most of the BWC literature is about citizen-facing effects. A smaller, useful body of work studies the officer experience: when cameras get activated, how footage gets reviewed, and how the burden of evidence management lands inside a department.

Predictive policing and risk-assessment tool use

The bias debate is well-known. The user-research question is narrower and less crowded: how do officers and analysts actually read these tools, what do they ignore, and where does the human-in-the-loop break down?

Court e-filing and self-represented litigants

Federal CM/ECF and state e-filing portals are some of the highest-stakes, lowest-budget user interfaces in the country. The research base on self-represented-litigant usability is small, growing, and largely underused by the courts that commission it.

Victim notification

Automated victim-notification systems (VINE and equivalents) have been in production for thirty years. The UX questions are about whether notifications are timely, intelligible, and trauma-informed — and what happens when they are not.

Tip lines and anonymous reporting

Crime Stoppers apps, school-safety reporting (Safe2Tell, SafeUT, P3 Tips), and community-tip platforms are a quiet UX category. The research questions are about trust, anonymity, false reports, and the operator side of the screen.

Digital-evidence tools

Evidence-management platforms, disclosure portals, and digital-forensics workflows sit between police, prosecutors, defense, and labs. The UX failures are often cross-organizational — chain-of-custody confusion, version drift, broken handoffs — and the user research on them is thin.

Jail visitation platforms

Video visitation has replaced in-person visits in many jurisdictions. The research is clearest on the structural effects (visit frequency, family contact) and thinner on the moment-to-moment UX of the calls themselves.

Open the bibliography →