Dark Discourse

At SXSW recently, IDEO presented a robot pet: a furry blob designed to trigger mammalian caregiving instincts. Soft fur, leopard spots, curled sleeping posture. The “push” (in IDEO-speak): “we're inventing a new life-like form unconstrained by biology, so why do we keep making robots look like sad metal humans?” The design question being asked is: what shape should the robot take so people will trust it, love it, not fear it?*

Not “what is this thing doing?” Not “who controls it?”, “what data does it collect?”, “what energy does it burn?”, “whose interests does it serve?”. Those questions are basically unaskable in the default “human-centered” design discourse.

The design operation here is domestication, not only of the object but of the viewer's perception. The fur and the sleeping pose domesticate your response to a system that is not domestic, not alive, not dependent on your care, and quite possibly not serving you. Beware objects that seek to become your pet, lest you become their pet.

Is this a dark pattern? In the classic sense (roach motels, confirmshaming, trick questions), a dark pattern assumes a designer who sees through the deception and deploys it strategically. The designer is the trick's performer, and the user is the victim.

A domesticated robot is different. A trick is still being played, but the designer is not its performer; the designer is its first victim. The fur and the sleeping pose, features that trigger the human care response, render the full consequences of the object illegible. The designer is not fully aware of producing anything other than an expression of the object's friendly possibilities, so this is not a dark pattern as dark patterns are usually understood.

The real darkness is in the discourse.

Human-centered design, as practiced, has a grammar. That grammar can express: affect, delight, trust, playfulness, form, accessibility, inclusion. It does not get to express: indirect harms, downstream harms, harms to non-user-actors, epistemic harms, ontological harms. It has no vocabulary for what the designed object does to people who never interact with it, or to systems that have no interface.

Zachary Kaiser's Interfaces and Us (bookshop, kobo)† makes this precise and explicit: the interface pre-specifies what a person is (a set of preferences, clicks, consent toggles), and everything outside that specification — the power asymmetry, the surveillance, the coercion — has no place in the representation. It's not just hidden, it is ontologically excluded.

A dark discourse is one where what's rendered unknowable is specifically the harm the discourse itself generates. It reproduces entire communities of good-faith practitioners, because the discourse is what trains them, credentials them, rewards them, and gives them their sense of professional identity. You can swap out every designer and the dark discourse persists.

There's a tension at the heart of “human-centric design” today, one that is exacerbated by the sheer power of technologies and the ruthlessness of those who would roll them out across the world. That tension is about how much a designed object should communicate its inner workings. The default approach, as the “stack” has become ever bigger and more complex, and its capabilities more advanced, has been to simplify the object's expression to its user to the greatest extent possible, and to hide its inner workings except where it needs to seek input from the user. This renders tools usable and powerful, sure, but it is also used to epistemically disconnect users from the harms the object might pose, whether to users themselves, to society, or to the biosphere at large.

The human-centric tradition's answer to cui bono — who benefits? — is in practice, despite protestations to the contrary, the user. A more accurate observation might be qui pendit, solus penditur — only the one who pays is counted.


* I want to make it super-clear that, while this post may have been triggered by seeing this particular thing on LinkedIn, I am seeing very few designers updating their assumptions about what constitutes “good design”, or what the role of the designer is in society, in the light of the last decade and a half of all-consuming technological/capital onslaught.

† A book whose subtitle, “User Experience Design and the Making of the Computable Subject”, is an eye-opener in itself.