Ethnographer Loup Cellard has a new paper out, Algorithms as figures. In general it’s a thoughtful intervention, quite aware that “algorithm” is a disputed term, with uses that have moved a long way from its software-engineering design function, to the point where they sometimes barely describe the same construct at all.

There’s a telling passage about halfway through, where French civil servants insist a paper-based bureaucratic decision procedure should be understood as an algorithm.

For example, in response to freedom of information (FOI) requests, the French administrative regulator Commission of Access to Administrative Document (CADA) qualified the following systems as algorithms: a decision tree used to coordinate the intervention of firefighters and ambulances, the calculus of pensions for independent workers or even a paper-based grid of criteria used to score secondary schools students willing to enter a path of excellence in a special high school.

And of course this isn’t all wrong. A key property of an algorithm is that it is computational, but not that it necessarily runs on a digital computer. An axe-head made of stone and an axe-head made of iron are both axe-heads, and they both cut things. Presumably these procedures can even be analyzed for performance, in linear or quadratic time, though they mostly sound like nested if statements. At the same time, if the information-manipulation properties are not abstracted from the domain context, e.g. through parameters, a key engineering point of algorithm design is missed. An algorithm is an abstracted material component which can be formally shown to have certain mathematical properties. Cellard thinks about and acknowledges this software basis for the term.
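
To make the abstraction point concrete, here is a minimal sketch (my own illustration, with invented criteria, not anything from the actual CADA cases). The first function is the paper-grid style: domain rules and control flow fused together as nested if statements. The second separates the information-shuffling part, picking the first matching rule, from the domain criteria, which become parameters:

```python
# Paper-grid style: the scoring criteria and the control flow are fused.
def score_student_inline(grades_avg, recommended, distance_km):
    if grades_avg >= 14:
        if recommended:
            return 3
        return 2
    if distance_km > 20:
        return 1
    return 0

# Abstracted style: the reusable algorithmic component knows nothing
# about students; the domain criteria are passed in as parameters.
def first_matching_score(rules, default, case):
    """Return the score of the first rule whose predicate matches the case."""
    for predicate, score in rules:
        if predicate(case):
            return score
    return default

# The same (invented) criteria, now expressed as data.
student_rules = [
    (lambda c: c["grades_avg"] >= 14 and c["recommended"], 3),
    (lambda c: c["grades_avg"] >= 14, 2),
    (lambda c: c["distance_km"] > 20, 1),
]
```

Only the second form could be swapped into a different bureaucratic domain, or formally analyzed on its own, which is roughly the engineering sense of “algorithm” that the paper-based usage leaves behind.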

On the other hand, even though it is possible to implement an algorithm in other media, like a paper-based bureaucratic procedure or a Hungarian folk dance, that is usually not a very practical choice of material, given the bandwidth, processing speed, and error rates involved. The sophisticated biological and social implementations of algorithms out there tend to have poor modularity; you need to rework them heavily to abstract out the information-shuffling parts before you can reuse them.

Now, Cellard’s piece doesn’t find the need to spend much time on all of that, but, as good ethnography, it observes, reports, participates in, and theorizes about what people actually do with the term in a particular social context.

As a device, the FOI requests performatively creates the very reality of the entity it is supposed to make accountable. In other words, by trying to find a solution to unfair administrative decisions, the CADA has provoked the coming into being of the entity ‘algorithm’. If the algorithm is then not attached anymore to the digital environment, it becomes a post-digital entity used to envision administrative procedures as ‘algorithmic’ instructions, a practical scheme with explanatory effects.

Cellard draws on post-digital theorists David Berry and Florian Cramer to avoid being tied to algorithms as purely digital artifacts. To me, it also helps highlight that much concern about “algorithms” is really about decisions, whether made as individual judgement calls, or according to some decision procedure in a rule book or code. Cosma Shalizi frames this perspective well in this informal intro, seeing the technical aspect of the problem as one of solving for statistical constraints, say by testing for types of bias. One of the things I like about Shalizi’s viewpoint is it makes all decisions made within black boxes fair game, whether it’s a human or a Turing machine inside.
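
A minimal sketch of that black-box framing (my own illustration, not Shalizi’s code): treat the decider as an opaque function and test its recorded decisions for a statistical constraint, here the demographic-parity gap between two groups. Nothing in the test cares whether the function wraps a model, a rule book, or a transcript of human rulings:

```python
def demographic_parity_gap(decide, cases, group_key):
    """Absolute difference in positive-decision rates between two groups.

    `decide` is any black-box predicate; `cases` are dicts, each tagged
    with a group label under `group_key` (assumed: exactly two groups).
    """
    tallies = {}  # group -> (positive decisions, total cases)
    for case in cases:
        group = case[group_key]
        yes, total = tallies.get(group, (0, 0))
        tallies[group] = (yes + (1 if decide(case) else 0), total + 1)
    (a_yes, a_n), (b_yes, b_n) = tallies.values()
    return abs(a_yes / a_n - b_yes / b_n)
```

Demographic parity is just one of several competing constraints one could plug in here; the point is that the audit interface is the decision record, not the mechanism inside the box.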

Cellard’s “algorithm as figure” connects with this decision-centric viewpoint. To me, it even suggests a “desire for the algorithmic” in the community being analyzed, one that is more about systemisation and evoking a deciding agent than it is about quicksort or trading off engineering constraints. We could even say that this misnamed decision agent is an unattainable aesthetic ideal. The romanticized computer-governor can’t be obtained because a misunderstood component has been substituted for an image of the whole.


References

Babbitt, W., Lachney, M., Bulley, E., & Eglash, R. (2015). Adinkra Mathematics: A study of Ethnocomputing in Ghana. Multidisciplinary Journal of Educational Research, 5(2), 110–135. https://doi.org/10.17583/remie.2015.1399

Cellard, L. (2022). Algorithms as figures: Towards a post-digital ethnography of algorithmic contexts. New Media & Society, 24(4), 982–1000. https://doi.org/10.1177/14614448221079032

Shalizi, C. (2022, April 14). Ethical and Political Issues in Data Mining, Especially Unfairness in Automated Decision Making. http://bactra.org/notebooks/ethics-politics-data-mining.html