Transmission in Motion

December

“On the pedagogy of algorithms” – Tamalone van den Eijnden

The underlying idea of pedagogy is that parents, teachers and other adults bear responsibility for the upbringing, education and conduct of children. These responsibilities range from the mental life of the child to behavioural manners to legal liability. Ideas of responsibility in education are not limited to human-human relations but are equally applicable to trans-species alliances. For instance, consider human-dog relations: it is the owner’s task to teach the dog simple commands, the owner has to ensure that there is no faecal matter on the pavement, and it seems that “the tension flows down the leash,”[i] meaning that dogs adopt the emotional sensitivity of their owners: a more nervous owner is likely to have a more nervous dog.

Today, in the age of algorithms, we are extending ‘our pedagogical responsibilities’ towards the inanimate as well. In his talk “Image Classification Using Deep Neural Networks,” Melvin Wevers discussed how computational neural networks can be used to identify handwriting and recognize objects. Specifically, he raised the question of how archives of visual material can be used for more targeted research. It is the quest for a tool that would save the academic from manually reviewing all the material (e.g. all advertisements since the 1950s) in order to compile the relevant images (e.g. only cigarette advertisements); a tool, in short, that would allow the scholar to retrieve the relevant corpus of analysis with a few keywords. Since it would be an endless project to tag all the images manually with the relevant metadata, an algorithm, in Wevers’s words, needs to be ‘trained’ to recognize images. The word ‘training’ is representative of a larger discourse around new and complex technologies, including ‘machine learning,’ ‘deep learning,’ and ‘neural networks.’
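To give a concrete sense of what such ‘training’ involves, here is a minimal sketch, assuming PyTorch; the two-class task (‘cigarette ad’ vs. ‘other’), the tiny network, and the random stand-in data are illustrative assumptions, not Wevers’s actual pipeline.

```python
import torch
import torch.nn as nn

# A tiny convolutional classifier: visual patterns in, label scores out.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # detect low-level visual patterns
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 2),  # map pooled patterns to two labels: cigarette ad / other
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# 'Training' means showing the network human-labelled examples and nudging
# its weights; the curator's categories (and biases) enter through the labels.
images = torch.randn(8, 3, 64, 64)   # stand-in for scanned advertisements
labels = torch.randint(0, 2, (8,))   # stand-in for human-assigned tags
for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```

Once trained on enough labelled material, the same model can be run over an untagged archive to retrieve candidate images for a given category, which is the labour-saving promise described above.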

At first, the word ‘training’ may seem an out-of-place anthropomorphism. However, I think this metaphorical understanding may point to another important aspect of ‘training algorithms,’ namely the question of responsibility. We must understand that those designing algorithms leave traces of their (unavoidable) biases; in other words, just like the child or the dog, the algorithm ‘can’t help’ how it is ‘brought up.’ For example, what will appear under the keyword search ‘house’ depends very much on the coder’s awareness of what a house might look like, which varies according to socio-cultural background and personal opinion. How can we establish useful categories and make these biases transparent? The humanities might provide an important contribution towards an ‘ethics of training’ in the field of technological innovation.

This new ethics of training algorithms may also consider that pedagogy is often conceptualized as a two-way process. Parents may see through the eyes of their children, and teachers may learn from their students. Sometimes, the questions of somebody who has not yet fully internalized the logic of a certain system are particularly stimulating. Interestingly, the algorithm, too, refreshingly ‘does not understand.’ When Wevers conducted his image search, the result was a series of houses with white triangular straw roofs. The algorithm works by matching corresponding visual patterns, which are processed as a matrix of numbers; it does not understand the concept of a roof. This is why, next to a series of similar-looking houses, it also produces images of a white wedding dress and a strongly perspectival road: detached from meaning, the visual patterns are quite similar. In all three cases, we see a white triangular shape. Referring to this very instance, Maaike Bleeker asked: what can the algorithm teach us? How can it inspire us to new ways of thinking? Thus, a pedagogy of algorithms should not only be concerned with the responsibility of teaching but equally with the responsibility to learn.
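To make this tangible, below is a toy sketch, in plain NumPy, of what ‘matching visual patterns as a matrix of numbers’ amounts to; it is not Wevers’s actual system, and the feature vectors are invented for illustration. Imagine the first dimension roughly tracking ‘prominent white triangular shape.’

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two feature vectors, blind to what they depict."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Invented feature vectors (real systems use hundreds of dimensions).
straw_roof_house  = np.array([0.90, 0.10, 0.20])  # white triangle against sky
wedding_dress     = np.array([0.85, 0.15, 0.10])  # white triangular silhouette
perspectival_road = np.array([0.80, 0.20, 0.30])  # white lines converging to a point
night_sky         = np.array([0.05, 0.90, 0.70])  # no such pattern

query = straw_roof_house  # the 'house' search
for name, vec in [("wedding dress", wedding_dress),
                  ("perspectival road", perspectival_road),
                  ("night sky", night_sky)]:
    print(f"{name}: {cosine_similarity(query, vec):.2f}")
# The dress and the road score high, the night sky low: ranking rests on
# numeric resemblance alone, and the concept 'roof' never enters the computation.
```

The toy numbers make only one point: similarity is computed over patterns, not concepts, which is precisely what makes the algorithm’s ‘misunderstandings’ instructive.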


[i] Coren, Stanley. “Do Nervous Dog Owners Have Nervous Dogs?” Psychology Today, February 16, 2017.