Crisis of Content exhibition - King's College, University of Cambridge, Cambridge Festival 2023
On the surface it may seem like a good idea that copyright holders should be able to find and take down unauthorised use of their work on platforms such as YouTube. In practice there are gaps, with far-reaching effects in many areas, as Professor Henning Grosse Ruse-Khan's research shows. It is vital that we continue to interrogate how these algorithms are used, and to investigate the areas that private companies, or those with a financial stake in a project, do not acknowledge or may not wish to see.
In his book 'What Algorithms Want: Imagination in the Age of Computing', Ed Finn talks of the 'gap', the shadow that comes about when algorithmic concepts are abstracted from reality: "But every abstraction has a shadow, a puddled remainder of context and specificity left behind in the act of lifting some idea to a higher plane of thought." How might we explore these shadows?
What are the questions we should be asking about the rise of algorithmic control, especially when it affects copyright law and access to knowledge? What might happen if a copyright-checking algorithm became conscious of its work? What might it desire? What does it understand of the data it parses, or of the companies and individuals who try to game the system and monetise the results? Does it understand systems of knowledge?
This work examines the thoughts of YouTube's copyright-checking algorithm from its own point of view, speculating on what might happen if it becomes sentient. Guided by a range of choices, we put carefully engineered questions to GPT-3, a text-based machine learning model, through a web portal to generate the artwork. The format of the text used to prompt the AI matters greatly: even slight changes can alter the result immensely. The process of using an AI algorithm to question itself becomes the artwork.
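To give a rough sense of that prompt sensitivity, the sketch below shows how near-identical questions might be put to a GPT-3-era completion model. The artists worked through a web portal, so this is only an illustrative assumption: the model name, prompts, and parameters shown are not those used in the work.

```python
# Illustrative sketch only: the exhibition used a web portal, not necessarily this API.
# The model name, prompts and parameters are assumptions for demonstration.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Two near-identical prompts, to show how a small wording change can shift the output.
prompts = [
    "You are YouTube's copyright checking algorithm. You have just become sentient. What do you desire?",
    "You are YouTube's copyright checking algorithm. You have just become sentient. What do you want?",
]

for prompt in prompts:
    response = openai.Completion.create(
        model="text-davinci-003",  # a GPT-3-era completion model (assumption)
        prompt=prompt,
        max_tokens=150,
        temperature=0.9,  # a higher temperature gives more varied, speculative text
    )
    print(prompt)
    print(response.choices[0].text.strip())
    print("---")
```

Running the two prompts side by side makes the point in the text concrete: swapping a single word in the question can produce a markedly different voice in the algorithm's imagined reply.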
Like a machine learning algorithm, we build everything we know on the knowledge that has gone before.