The polarisation and segregation fuelled by algorithms may end up shaping the city. AI feeds off itself: like an infinite mirror, it insists on showing us what it already knows, but it puts forward no alternative models. As citizens we must learn to use it so as to prevent it from taking over our decisions.
“Getting back to the city, the reverberations of AI may end up shaping it and giving rise to a more uniform social organisation, one in which errors, the unusual, the random, the spontaneous and the bizarre disappear. The economic and social model we oppose – a model in which poverty and opulence are part of the way it works – may end up not being called into question but, on the contrary, optimised. Urban space may become an arena for micro-targeting. These are all risks that we will have to address and remain constantly alert to. Let us resist the pleasure of a friendly, clear and clean city. Let us look behind it. Let the city make us uneasy, both in what we see in it and in the trouble we have in understanding it.”