Fair Algorithms for All
Let’s start at the beginning: What are human computer interfaces and inclusive technologies?
Elisabeth Lex: Algorithms and artificial intelligence work very well in many areas today, but mainly for people with mainstream needs. My field of research is about designing these systems in such a way that they also cater for people with special or less common needs. We develop technologies that are truly inclusive and adapt to the people who use them – and not the other way round.
Why is this necessary? Why don’t these systems work equally well for everyone?
Lex: There are various reasons. Often there is simply too little data, or too little differentiated data, on certain groups of people, so their needs remain invisible. This is not necessarily intentional; often these groups were simply not considered during development, or the data sets were not compiled with diversity in mind.
To give a concrete example, is this about large language models, which are frequently being discussed at the moment?
Lex: Yes, among other things. It is important to me that LLMs become tools that really support people with very different needs. To achieve this, information must be accessible without barriers. I think recommender systems are also very important. We see many forms of bias there. They work less well for some genders or age groups and do not reflect all realities of life equally well. This is evident in something as mundane as music recommendations: as soon as I listen to something outside the mainstream, the recommendations get significantly worse.
In our current research, we are investigating whether recommender systems can map the reality of life for people with different needs at all – and whether the necessary information is contained in the training data. In a recent study, for example, we analysed point-of-interest recommenders that make recommendations for places of interest or restaurants. We specifically searched for places with barrier-free access and discovered a strong urban-rural divide. There is much more data and much more differentiated data in cities than in rural areas, and the recommendations are correspondingly better.
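The urban-rural coverage gap described here can be made concrete with a toy calculation. The following sketch is purely illustrative and not the study's actual code: the POI records, the urban/rural labels and the `wheelchair` field (loosely modelled on OpenStreetMap-style accessibility tags) are invented. It simply compares the share of places that carry any accessibility information at all in urban versus rural data.

```python
# Illustrative sketch: quantifying an urban-rural gap in accessibility
# information for points of interest. All records below are invented.

def accessibility_coverage(pois):
    """Share of POIs that carry any wheelchair-accessibility tag at all."""
    if not pois:
        return 0.0
    tagged = sum(1 for p in pois if p.get("wheelchair") is not None)
    return tagged / len(pois)

# Toy POI records with a hypothetical urban/rural label.
pois = [
    {"name": "Cafe A",       "area": "urban", "wheelchair": "yes"},
    {"name": "Museum B",     "area": "urban", "wheelchair": "limited"},
    {"name": "Restaurant C", "area": "urban", "wheelchair": None},
    {"name": "Inn D",        "area": "rural", "wheelchair": None},
    {"name": "Shop E",       "area": "rural", "wheelchair": None},
    {"name": "Church F",     "area": "rural", "wheelchair": "no"},
]

urban = [p for p in pois if p["area"] == "urban"]
rural = [p for p in pois if p["area"] == "rural"]

gap = accessibility_coverage(urban) - accessibility_coverage(rural)
print(f"urban coverage: {accessibility_coverage(urban):.2f}")
print(f"rural coverage: {accessibility_coverage(rural):.2f}")
print(f"coverage gap:   {gap:.2f}")
```

Note that a recommender trained on such data has nothing to go on for rural places, whatever its ranking algorithm: the gap is in the data itself, which is why the interview stresses data generation initiatives rather than model fixes alone.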
So far, we have analysed data from the USA and want to extend the analysis to Europe. A comparative study is particularly interesting here because many cities have grown historically and accessibility was often not planned from the outset.
But if this information is not included in the data, what possibilities do you have to influence it?
Lex: The first step is to make these problems transparent and to involve the affected group. To take the example of the point-of-interest recommender – there are great initiatives such as Wheelmap, an online map to mark and find wheelchair-accessible places. Anyone can contribute to generating data on the topic of wheelchair accessibility. It is also promising to merge previously separate data sources and to develop systems truly together with different stakeholders.
The Professorship of Human Computer Interfaces and Inclusive Technologies is financially supported by the Archduke Johann Society's initiative for children and adolescents with disabilities. The endowed professorship is intended to bring inclusion and technology even closer together.
Why are you so interested in this topic?
Lex: I’ve always been very interested in how to make access to information easier – because that’s what makes us strong as a society. Especially for people with disabilities or special needs, access to the huge amount of available information is often not well organised. This excludes them, and I want to change that.
I also find it extremely exciting to see how people acquire knowledge, learn and develop. Recommender systems are also about suggesting learning resources, sources or people. From a scientific perspective, I am interested in the psychological processes behind learning and how people with learning difficulties, for example, can be given targeted support.
My research is always about having a positive influence on social processes. Making machines smarter definitely has its appeal, but at the end of the day it’s people who work on these machines – and my work is intended to help them.
Do you have a connection to psychology?
Lex: Many of the algorithms I work on are inspired by psychological or sociological models. They help us to better understand learning behaviour, individual behaviour and different needs. This enables us to develop systems that are truly people-orientated.
Where has your academic career taken you so far?
Lex: I studied telematics – which is at the interface between electrical engineering and computer science – at TU Graz. After graduating, I initially worked in the private sector, but quickly realised that this was not enough for me. That’s why I decided to do a doctorate and to focus on access to information via visual and text-based technologies.
I then went to Argentina as a postdoc, where it was particularly exciting to carry out research with very limited technical resources. As a postdoc at RWTH Aachen University, I started to work on the topic of bias and fairness in algorithms. The topic fascinated me – especially the idea of looking at an algorithm not just as a technical artefact, but as something embedded in a social context.
I eventually returned to TU Graz, completed my habilitation there and have now taken on the professorship. TU Graz offers an ideal combination of technical excellence, interdisciplinary openness and a rapidly growing community in the field of human-centred AI. There are teams here who are enthusiastic about inclusive technologies and an infrastructure that enables both basic research and concrete applications.
Contact
Elisabeth LEX
Institute of Human-Centred Computing
Sandgasse 36/III
8010 Graz
Phone: +43 316 873 30640
elisabeth.lex@tugraz.at