Jack’s Car skimmed down a slip road at 60 miles per hour, comfortably decelerating to an even 50 miles per hour as the road evened out. This deceleration was due to the decreased visibility of the road. Jack’s Car did not decelerate to 40 miles per hour, because it was a dry night rather than a wet night; it was, however, a cloudy night. The route Jack’s Car took diverged from the motorway. Jack’s Car took this route because the motorway had many cars on it. This means that a car may get snarled up in a traffic jam. It also increases the chance of an accident happening, due to many cars being on the road. Rather than risk the increased likelihood of an accident, the alternative was to leave the motorway via a slip road which leads to a lesser used, less well tarmacked road, cutting directly from one side of a plot of land to the other in a straight line. The land is used for tree farming and is not well lit. This is why Jack’s Car decelerated to 50 miles per hour and, in addition, turned its headlights up to full beam. An accident is much less likely on this road than on the motorway, for although it is not as well tarmacked, the journey becomes shorter and there are fewer cars sharing the road.
Jack was at home, waiting for his car, which was returning from a drive-through. Jack’s Car left Jack’s house for the drive-through, which is attached to a supermarket, at around 19:00. Other car owners like to have their shopping at home by the time they return from work; Jack does not have this preference enabled, so Jack’s Car waits until there are fewer cars on the road before leaving. There is always traffic on the motorway, however, so the best route to and from the supermarket often involves cutting across the tree farm. In addition to fewer instances of car accidents, this road has the benefit of being shorter, and therefore more fuel efficient, and therefore better for the environment, and also cheaper.
Jack’s Car drove along the road in near silence, bar a low hum which Jack’s Car (and all other cars) emitted to warn pedestrians that a car was driving towards them. There were never pedestrians on this particular road, or any road, but all cars hummed all the time because it was legislated. However, Jack’s Car was always ready to decelerate in response to a pedestrian stepping into the road at any time, and was always ready to obey the directions of markings on the road. Jack’s Car was familiar with road markings which issued instructions to merge, give way and so on, even when it had not encountered a particular set of markings on that particular road before. Road markings sometimes change, and Jack’s Car needed to read them anew each time it encountered them, in case they had changed. This is why, as Jack’s Car was equidistant from the entrance and exit of the tree farm, it detected a new dashed line followed by a solid line, recognised it as ‘right of way’, and passed over it. This is also why, having passed over the line and finding itself confronted with a solid line followed by a dashed line, meaning ‘no entry’, Jack’s Car stopped.
This trap, which trapped Jack’s Car, is laid simply by drawing a pair of concentric circles on the road: the outer dashed, the inner solid. From outside, right of way; from inside, no entry.
A group of pedestrians stepped out from the gloom between the trees and made their way to Jack’s Car. Jack’s Car saw the pedestrians as they stepped into its high beams. They hummed as they made their way closer.
James Bridle’s Autonomous Trap series (in which the artist ‘traps’ autonomous cars in the manner described above) describes the dichotomy of the algorithm as both slavishly procedural and logic driven, and mystified and inscrutable. While it may appear to support the idea of the algorithm as procedural to a fault, the act of trapping the car in a magic-inspired ring of salt demystifies the algorithm threefold: it allows the pedestrian to arrest the procedure by subverting its rules; it allows them to do so using analogue tools; and in doing so it undermines the idea of the algorithm as ceaseless or incomprehensible. As I read it, the work is a call to arms, a demystification of the algorithm and an invitation to think inventively about algorithms’ limitations.
After reading Tarleton Gillespie’s Algorithm [draft] [#digitalkeywords], the idea which seemed to me most potent was that of the algorithm as a ‘talisman’. The talisman has the power to ward off culpability, absorb blame or anoint the actions of its author.
The idea of the algorithm as autonomous from its author can be comforting. It suggests impartiality and fairness, utilitarianism and efficiency. And in many cases this is true, but fair and efficient for whom? Pay no attention to the man behind the curtain.
Although many driverless cars use a range of methods to detect objects – radar, for example – computer vision systems are cheaper and potentially a more market-friendly option. However, researchers from Virginia Tech found that machine vision systems are consistently poorer at detecting people with darker skin tones than people with fairer skin tones. This was true even when the researchers removed occluded pedestrians from the test set and tested only on images of people in full view:
[…](small pedestrians and occluded pedestrians) are known difficult cases for object detectors, so even on the relatively “easy” subset of pedestrian examples, we observe this predictive inequity. We have shown that simple changes during learning (namely, reweighting the terms in the loss function) can partially mitigate this disparity. We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sort of recognition models.
This is not a flaw in the algorithm, but a flaw in its training data.
An algorithm has an author, or set of authors. While the logic-driven procedure of the algorithm may function impartially and fairly, it bases its decisions on the data it was trained on during its development. If this data has not been sufficiently scrutinised, as in the case of the object detection software examined in the Virginia Tech study, the algorithm may enact the biases of its author – conscious or otherwise.
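The mitigation the study mentions – reweighting the terms in the loss function – can be sketched in a few lines. The example below is illustrative only, not the study’s code: the group labels, weights and toy data are invented. The idea is that examples from an under-represented group are given a larger weight, so errors on them cost the model more during training.

```python
import math

def weighted_cross_entropy(probs, labels, group_ids, group_weights):
    """Mean cross-entropy where each example's term is scaled by the
    weight of the (hypothetical) group that example belongs to."""
    total = 0.0
    for p, y, g in zip(probs, labels, group_ids):
        # Probability the model assigned to the true label.
        p_true = p if y == 1 else 1.0 - p
        # Scale this example's loss term by its group's weight.
        total += group_weights[g] * -math.log(max(p_true, 1e-12))
    return total / len(labels)

# Toy data: group "b" is under-represented, so it gets a larger weight.
probs     = [0.9, 0.8, 0.6]   # model's predicted P(label == 1)
labels    = [1, 1, 1]
group_ids = ["a", "a", "b"]
weights   = {"a": 1.0, "b": 2.0}

loss = weighted_cross_entropy(probs, labels, group_ids, weights)
```

With equal weights this is ordinary cross-entropy; doubling the weight on group "b" makes the poorly-detected example (0.6) dominate the average, which is exactly the pressure the reweighting is meant to apply.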
I think the thing to remember is that the algorithm is doing its job – we might just not know who hired it.