A Sunday Telegraph article
On Sunday, 20th November 2022, the following article appeared in the Sunday Telegraph.
Oxford to ‘decolonise’ computing degree and emphasise slavery links
Course is latest to be targeted as universities increasingly ‘go for woke’ in scientific subjects
By Ewan Somerville
Oxford has said it will “decolonise” computing degree courses because of alleged slavery links to machine learning. The university’s computer science department has overhauled modules to show students “how global histories of domination and subjugation have impacted the structures of science they see and the assumptions they encounter”.
It says it is committed to “understanding what it means to decolonise the curriculum and examining preconceptions that have been taken for granted for decades, if not centuries”.
The faculty, one of the oldest computer sciences hubs in the UK, says there is “growing awareness” that “new technologies can have a detrimental effect on individuals, communities and entire societies”.
But the department, headed by Prof Leslie Ann Goldberg, has come under fire for becoming “colonised” itself by US radical critical race theories.
“We need to go beyond understanding these effects to realise that they are often rooted in a colonial past that even at its most benign, sought to impose Western standards and understandings on other countries, and at its worst enslaved and reduced local populations, creating divisions and hierarchies of value that are replicated in the vast datasets so often used in machine learning,” faculty chiefs said.
Urging campus chiefs to ‘go woke’
It comes after it emerged last week that the body advising universities on degree standards is now urging campus chiefs to “go woke” by decolonising most subject areas.
The Quality Assurance Agency’s new advice says that computing courses should address “how divisions and hierarchies of colonial value are replicated and reinforced” within the subject, while maths curriculums “should present a multicultural and decolonised view”.
The Oxford computer sciences department has announced that “being non-racist is insufficient” because the university “has benefitted from and perpetuated attitudes and practices rooted in deeply wrong biases and prejudices”.
It claims that as a result, “carrying out research that is truly representative requires an anti-racist position”, including decolonisation and “rejecting the conscious and unconscious biases of the past”.
The faculty’s new modules include one on computers in society and another on ethics and responsible innovation.
The move comes as British universities increasingly turn their focus on the ills of the British Empire from humanities courses towards maths, science and engineering degrees – despite critics saying it has no relevance to the curriculum.
The department added that “computer science itself has been characterised as a colonial system, exporting technology designed for particular cultural and social contexts into other areas of the globe, without regard to local needs or contexts”.
Therefore, it says “there is an urgent need to work on decolonising digital innovation, digital content, and digital data to explore how databases and images might support indigenous knowledge systems”.
‘America’s grievance industrial complex’
Toby Young, head of the Free Speech Union, told The Telegraph: “With the capture of the Oxford computer science department, the colonisation of Britain’s universities by America’s grievance industrial complex is complete.
“Henceforth, it doesn’t matter what subject you choose to study at university, whether it’s English or computer science – you will be taught about critical race theory.”
Among the research projects established by Oxford’s department is an “ethical hackathon” model to help students embed fairness and responsibility mechanisms into the design of tools and systems.
A University of Oxford spokesman said: “All faculties regularly review and update their course curricula to reflect the latest developments in the subject, and recent initiatives have broadened the topics that we teach and research.
“Most science courses include content covering the ethical and social issues around their subjects. Such content is often formally required by accrediting bodies such as the engineering institutions.”
Most of the material about Oxford Computer Science in that article appears as a string of quotations, and it's not too hard to find the source – a page on the University's "Oxford and Colonialism" website. Here's the content of that page in full, as it existed on 22nd November 2022, so you can see that the quotations in the Telegraph article are accurate.
"...decolonizing the curriculum means creating spaces and resources for a dialogue among all members of the university on how to imagine and envision all cultures and knowledge systems in the curriculum, and with respect to what is being taught and how it frames the world." [Keele University, 'Keele Manifesto for Decolonising the Curriculum']
The Department of Computer Science was founded in 1957, which makes it one of the oldest university computer science departments in the UK. Although relatively young by Oxford’s standards, it is nevertheless part of an ancient institution which, it is acknowledged, has benefitted from and perpetuated attitudes and practices rooted in deeply wrong biases and prejudices. The Department recognises and acknowledges its position within this institution and understands that being non-racist is insufficient – carrying out research that is truly representative requires an anti-racist position. This includes working on understanding what it means to decolonise the curriculum and examining preconceptions that have been taken for granted for decades, if not centuries.
One aspect of our work here is a growing awareness in computer science, and its related disciplines, that new technologies can have a detrimental effect on individuals, communities and entire societies. We need to go beyond understanding these effects to realise that they are often rooted in a colonial past that even at its most benign, sought to impose Western standards and understandings on other countries, and at its worst enslaved and reduced local populations, creating divisions and hierarchies of value that are replicated in the vast datasets so often used in machine learning. We teach our students about these issues in our courses, including Computers in Society, and Ethics and Responsible Innovation and encourage them to consider how such problems might be addressed. We believe this work must go further, to emphasise how global histories of domination and subjugation have impacted the structures of science they see and the assumptions they encounter. We can further encourage our students to reflect on their own role whilst they are studying, and then in their future careers: we can highlight their responsibility to consider questions such as, what values do we have as computer scientists? How can we ensure we work for the social good? What assumptions are we bringing to our work?
Several of our research projects and activities also seek to co-create technological innovation and develop novel methods to address structural social inequalities. Using an “ethical hackathon” model, we work with students to consider how ‘fairness’ and responsibility mechanisms can be embedded into the design of tools and systems. For example, with colleagues in InSiS and from the Harare Institute of Technology in Zimbabwe, we broadened the ethical hackathon to a laboratory hackathon which tackled the challenge of resource-scarcity in STEM education in southern Africa. The students who participated in the hackathon used their expert local knowledge of the STEM education context and the availability of resources to prototype low-cost lab equipment that could be easily replicated. Similarly, several projects seek to develop low-cost open-source conservation technology including novel wildlife tracking solutions using GPS, acoustics and motion monitoring. These projects focus on working with conservationists to create efficient, effective low-cost manufacturing and distribution models that can challenge the – often prohibitively priced – commercial technology. In these projects, we are directed by local researchers and collaborators, and seek to support their work, rather than attempting to impose our own assumptions on them.
Much of our work also centres on positive ways in which innovations and technologies can be channelled to the benefit of society. One of our projects helps to train UK doctoral students in Responsible Research and Innovation – this is a way of carrying out research and development that seeks to assess possible harms that technology might generate, and to try to change course to avoid potential undesirable effects. We believe computer science, and technology in general, should work for the benefit of society and this includes many factors, including giving the under-represented a voice. Our Women in AI & Ethics conference in 2019 sought to give a platform to women working in STEM, who are often in the extreme minority in their teams or departments. This isolation can be amplified if they are women of colour, and we wanted to provide an opportunity to connect and collaborate.
We understand that our search for ‘the best’ students and staff must take into account factors such as representation, and wider social impact, as well as academic excellence. To this end we are investigating ‘blind’ CV-sifting processes, as work has shown that this can have a significant impact on diversity.
We also understand that computer science itself has been characterised as a colonial system, exporting technology designed for particular cultural and social contexts into other areas of the globe, without regard to local needs or contexts. In particular, learning from our previous research, there is an urgent need to work on decolonising digital innovation, digital content, and digital data to explore how databases and images might support indigenous knowledge systems. To address this, we seek to incorporate perspectives from other areas of the world into our research and teaching, particularly within research groups such as Human Centred Computing.
Finally, we acknowledge the lessons that the events of 2020 have made so clear, not just in other countries but in the UK and here in Oxford. The University, and this Department within it, seek to be the best that we can be. This means making it clearer that we reject the conscious and unconscious biases of the past and that we seek a future for our work that incorporates a multitude of voices at every level.
The page has now been replaced with something more anodyne: archive copy here. (Thanks to Walid Sahibi for the link.)