Why does AI research and development need more women?
Regina Ammicht Quinn: In the 1950s and 60s, programming was women's work. The activity did not yet have the status it has today: the term "software" had not yet been invented, and "coding" was considered a secondary task, while fame and glory went to the manufacturers of the machines. Women were regarded as particularly qualified programmers because, after all (so went one of the arguments), they were also able to design knitting patterns. Many women laid the foundations for today's computer science, such as Mary Allen Wilkes, Grace Hopper and Fran Allen, to name but a few largely forgotten figures. The fact that there are few women in this field today therefore has little to do with DNA or supposedly gender-specific talents, but rather with a social history that has yet to be traced in detail.
AI is a technology that shapes our society and, as an "enabler", also steers decisive lines of development in central areas such as medicine, education and mobility. AI research and development is therefore a form of social development in which women should naturally participate on equal terms with men. At the same time, diversity in AI development means taking into account the needs and realities of different social groups. This is an important step towards preventing the exclusion of certain groups from relevant applications, limiting discrimination and developing AI for the benefit of all.
Does AI run the risk of consolidating existing gender roles?
Regina Ammicht Quinn: Yes, because AI reflects society with all the injustices that have existed up to now. One example is the forecasting system of the Austrian Public Employment Service (AMS), which was widely discussed a few months ago. Since women are disadvantaged on the labour market both structurally and because of prejudice, these disadvantages are encoded in the data and become the basis for calculating women's chances on the labour market in general. The poorer prospects calculated from these data may then result in poorer placement support and also have a negative impact on the self-image of women seeking employment.
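The mechanism described above can be illustrated in a few lines. The sketch below uses entirely hypothetical data (not the actual AMS system or its real statistics): a naive forecasting rule that scores applicants by their group's historical placement rate will automatically assign women lower scores wherever the historical record reflects structural disadvantage.

```python
# Minimal sketch with hypothetical data: a forecasting rule trained on
# historically biased placement outcomes reproduces that bias in its scores.
from collections import defaultdict

# Hypothetical historical records: (gender, was_placed) pairs from a labour
# market in which women were placed less often for structural reasons.
history = ([("f", True)] * 40 + [("f", False)] * 60 +
           [("m", True)] * 60 + [("m", False)] * 40)

def group_base_rates(records):
    """Placement rate per group -- the 'forecast' a naive model learns."""
    placed = defaultdict(int)
    total = defaultdict(int)
    for gender, was_placed in records:
        total[gender] += 1
        placed[gender] += was_placed
    return {g: placed[g] / total[g] for g in total}

rates = group_base_rates(history)
# A new female applicant inherits her group's historical disadvantage:
score_woman = rates["f"]  # 0.4
score_man = rates["m"]    # 0.6
```

If such scores then steer who receives placement support, the lower-scored group receives less help, which in turn worsens the next round of historical data: the feedback loop the interviewee describes.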
What can be done about these distortions?
Regina Ammicht Quinn: The basic principle is, of course, that social reality itself must change so that forecasting systems can draw on "fair data". But the AI algorithms themselves may also have been designed in such a way that they lead to discrimination. To prevent this, diversity in development departments can be an important tool, but not the only one. To name just three further important points: we need professional ethics for computer scientists, as already initiated by DADM (Discrimination-Aware Data Mining) and FATML (Fairness, Accountability and Transparency in Machine Learning). We also need standards for AI, enforced through audits, certificates and controls by public authorities for particularly sensitive applications, for example in medicine or the judiciary. And we need regulated, transparent and easily accessible channels for appealing against decisions made with the help of AI.
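To make the idea of an audit concrete, here is a minimal sketch of one check commonly used in the fairness literature, the demographic parity gap: the difference in favourable-decision rates between two groups. The data, group labels and any threshold are hypothetical; real audits combine many such metrics with qualitative review.

```python
# Minimal sketch of an audit-style fairness check: the demographic parity gap,
# i.e. the absolute difference in favourable-decision rates between two groups.
# All data below are hypothetical.

def demographic_parity_gap(decisions, groups):
    """Absolute gap in positive-decision rates between exactly two groups."""
    rate = {}
    for g in set(groups):
        members = [d for d, grp in zip(decisions, groups) if grp == g]
        rate[g] = sum(members) / len(members)
    a, b = rate.values()
    return abs(a - b)

# Hypothetical model decisions (1 = favourable) for applicants of two groups:
decisions = [1, 0, 1, 1, 0, 0, 1, 0]
groups    = ["m", "m", "m", "m", "f", "f", "f", "f"]

gap = demographic_parity_gap(decisions, groups)  # |0.75 - 0.25| = 0.5
# An audit might flag any system whose gap exceeds a regulatory threshold.
```

A public authority certifying a system for use in medicine or the judiciary could require such metrics to stay below agreed limits, alongside the appeal channels mentioned above.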