
Courts and state bar associations are (finally) beginning to engage with “robot lawyers” — software applications that guide a user through some legal process. In Utah, the state court system has even gone so far as to set up a regulatory sandbox for testing and evaluating alternative means of delivering legal services, from software applications to non-lawyer ownership of law firms.

It is tempting to think about the regulatory challenge here as merely deciding what a new market for legal services ought to look like. Courts should resist this temptation, and consider why the legal profession has ethical rules and strong fiduciary duties in the first place: to maintain trust in the law and legal institutions. Using this as a starting point changes the frame. It suggests a regulatory approach that is tailored to the unique strengths and weaknesses of client-facing legal software. “Robot lawyers” are not lawyers. They are software, and we shouldn’t think of them like humans. …


Our quest to build better, fairer courts is missing a critical ingredient: open API standards for court management systems, along with procurement requirements that mandate interoperability and prevent courts from being locked into a specific software vendor.

A looming Supreme Court ruling in Oracle v. Google could make the risks of vendor lock-in even worse, underscoring the urgent need for courts to collaborate on open, interoperable APIs.

A quick background: what is an API?

Oracle v. Google is a case about the copyrightability of software APIs. In simple terms, an API (application programming interface) is a defined way to access a set of functions in a piece of software. When competitors build their software against common or open APIs, it becomes easier for users to switch between products, or to mix and match software from different vendors. …
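To make the interoperability point concrete, here is a minimal sketch in Python. All of the names (`CaseRecordsAPI`, `VendorA`, `VendorB`, `fetch_docket`) are hypothetical, invented for illustration: the idea is that if two vendors implement the same open API, code written by a court depends only on the shared interface, and swapping vendors requires no changes to that code.

```python
from abc import ABC, abstractmethod


class CaseRecordsAPI(ABC):
    """A hypothetical shared (open) API for court case management systems."""

    @abstractmethod
    def get_case(self, case_id: str) -> dict:
        """Return basic metadata for a case."""


class VendorA(CaseRecordsAPI):
    """One vendor's implementation of the shared API."""

    def get_case(self, case_id: str) -> dict:
        return {"id": case_id, "source": "vendor_a"}


class VendorB(CaseRecordsAPI):
    """A competing vendor's implementation of the same API."""

    def get_case(self, case_id: str) -> dict:
        return {"id": case_id, "source": "vendor_b"}


def fetch_docket(system: CaseRecordsAPI, case_id: str) -> dict:
    # Court code is written against the shared API, not a vendor's
    # proprietary interface -- so either vendor can be plugged in.
    return system.get_case(case_id)
```

Because `fetch_docket` accepts anything that implements `CaseRecordsAPI`, a court could move from `VendorA` to `VendorB` without rewriting its own software — the practical meaning of avoiding vendor lock-in.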


UPDATE: You can try the simulation now at https://detainrelease.com

To date, the debate surrounding pretrial algorithmic risk assessment tools has focused on statistical quality and overall legality. In 2017, when Jason Tashea and I first taught our Georgetown Law students about risk assessments, our lesson summarized those debates in a lecture and discussion. Afterwards, we worried that students were taking away a simple, wrong lesson: that if a software tool is sufficiently “accurate”, it solves the problem at hand and requires no further scrutiny.

The discussion shouldn’t end there. Software has framing power: the mere presence of a risk assessment tool can reframe a judge’s decision-making process and induce new biases, regardless of the tool’s quality. This past fall, we wanted our students to engage with this broader, ecosystem-level issue, and to understand the far-reaching consequences that pretrial detention can have on defendants. …


“What makes something simple or complex? It’s not the number of dials or controls or how many features it has: It is whether the person using the device has a good conceptual model of how it operates.”
—Don Norman, “Living with Complexity”

“To define is to limit.”
—Oscar Wilde, “The Picture of Dorian Gray”

This is the second in a series of essays about open logic for public services. (You can find the first here.) Together, they argue for public systems that are built to be understood and navigated. …


Nonprofits serve the public interest. When they go under, their assets can be lost. Trusts can preserve public goods and protect them from the abyss of bankruptcy.

Last month, a trove of publicly-funded cancer research at The Center of Cancer Systems Biology was lost after the Center’s nonprofit fiduciary, Genesys Research Institute (GRI), collapsed into bankruptcy. Like many stories that go this way, the end came suddenly, then slowly: GRI closed the Center in September 2014, filed for bankruptcy protection nine months later, and then spent “months” trying to find a new home for the research material, including “thousands of little glass tubes of cells and proteins, pieces of human tumor tissue, and other biological samples”, representing a decade’s worth of work. …


It is exceptional that one should be able to acquire the understanding of a process without having previously acquired a deep familiarity with running it, with using it, before one has assimilated it in an instinctive and empirical way. — John von Neumann, “The Mathematician”

A person is not a machine, and should not be forced to think like one.
— Bret Victor, “Learnable Programming”

This is the first in a series of essays about open logic for public services. (You can find the second here.) Together, they argue for public systems that are built to be understood and navigated. This is an argument for open rules, logic, and algorithms, and for interfaces designed to help people learn about the complex systems they rely on every day, so they can improve their lives. …


This post was originally published in the fall of 2014.

Ello has gathered a fair amount of attention (if not actual market share) on the back of a promise never to sell user data in order to raise revenue. Partially in response to worries over the company’s acceptance of VC money, Ello has embedded this promise into its corporate charter, which now requires 90% of the company’s voting power to amend.

This approach addresses only a narrow slice (protecting Ello against activist investors) of a much broader and more fundamental problem: can users trust Ello to keep its promises? …

About

Keith Porcaro

Lawyer, technologist. Affiliate at Berkman Klein Center & Duke Center on Law and Technology. Adjunct prof. at Georgetown Law.
