Highlighting harms when writing design patterns

I enjoy writing design patterns.

I find them a useful way to clarify my thinking around different solutions to problems across a whole range of areas. A well-named pattern can also help to clarify and focus discussion.

I’ve written a whole book of Linked Data patterns and led a team that produced a set of design patterns for collaborative maintenance of data.

I’ve been planning to revisit some writing and thinking I’ve been doing around capturing design patterns for different models of data access, sharing, governance and modelling. Given all the confusing jargon that is thrown around in this space, I think writing some design patterns might help.

When I’ve been writing design patterns in the past I’ve used a fairly common template:

  • a short description of a problem
  • the context in which this problem might surface
  • an outline of a solution
  • some examples
  • a discussion which might cover variations of the solution or its limitations
  • links to other patterns

But I’ve been thinking about iterating on this to add a new section: harms.

This would lay out the potential consequences or unintended side effects of adopting the pattern, both for the system in which it is implemented and more broadly.

I started thinking about this after reading a paper that discusses the downsides of poor modelling of sex and gender in health datasets used in machine learning. I’d highly recommend reading the paper, regardless of whether you work in health or machine learning. This paper about database design in public services is also worth a look while you’re at it. I wrote a summary.

While the sex/gender paper doesn’t describe the issues in terms of design patterns, it’s largely a discussion of the impacts of specific data modelling decisions.

Some of these decisions are just poor. Capturing unnecessary personal data. Simplistic approaches to describing sex and gender.
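As a rough sketch of what this difference in modelling decisions can look like in practice (my own hypothetical illustration, not an example taken from the paper), compare a single catch-all field with a model that separates distinct concepts and records how each value was collected:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

# Simplistic modelling: one binary field that conflates sex and gender
# and forces every record into one of two values.
class SimplisticSex(Enum):
    M = "male"
    F = "female"

# A richer (though still simplified) alternative: separate concepts,
# plus provenance that lets downstream reusers judge whether the data
# fits their purpose once it leaves its original context.
class SexAssignedAtBirth(Enum):
    MALE = "male"
    FEMALE = "female"
    INTERSEX = "intersex"
    NOT_RECORDED = "not recorded"

@dataclass
class GenderRecord:
    sex_assigned_at_birth: SexAssignedAtBirth
    gender_identity: Optional[str] = None       # self-described, free text
    collection_method: str = "self-reported"    # who supplied the value, and how
    collection_purpose: str = "clinical care"   # the context it was collected for

record = GenderRecord(
    sex_assigned_at_birth=SexAssignedAtBirth.FEMALE,
    gender_identity="non-binary",
)
print(record.sex_assigned_at_birth.value)
```

The field names here are invented for illustration; the point is that the second model makes the original context of collection explicit, which is exactly the information lost when data is reused elsewhere.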

Work on design patterns has long attempted to highlight poor designs. For example by describing “anti-patterns” or “deceptive design patterns” (don’t call them dark patterns).

But some of the design decisions highlighted in that research paper are more nuanced. Decisions which may have been justified within the scope of a specific system, where their limitations may be understood and minimised, but whose impacts are greatly amplified as data is lifted out of its original context and reused.

This means that there’s not a simple good vs bad decision to record as a pattern. We need an understanding of the potential consequences and harms as an integral part of any pattern.

Some pattern templates include sections for “resulting context” which can be used to capture side effects. But I think a more clearly labelled “harms” section might be better.

If you’ve seen good examples of design patterns that also discuss harms, I’d be interested to read them.