
Fairness in AI: Study Shows Central Role of Human Decision-Making


by Falko Schoklitsch, published on 11.12.2025, Research


AI-based recommendations should not only be of practical help, but above all fair. A new study by researchers at TU Graz, the University of Graz and the Know Center shows how this can be achieved.
[Image: a smartphone mounted on bicycle handlebars, displaying a navigation app.] The route suggested by AI depends heavily on human input during its creation. Image source: primipil – Adobe Stock

AI-supported recommender systems should provide users with the best possible suggestions for their enquiries. These systems often have to serve different target groups and take into account other stakeholders who also influence the machine’s response, such as service providers, municipalities or tourism associations. So how can a fair and transparent recommendation be achieved? Researchers from Graz University of Technology (TU Graz), the University of Graz and the Know Center investigated this using a cycling tour app from the Graz-based start-up Cyclebee, examining how AI can take the diversity of human needs into account. The study, which received TU Graz’s Mind the Gap research prize for gender and diversity, was funded by the Styrian Future Fund.

Impact on numerous groups

“AI-supported recommender systems can have a major influence on purchasing decisions or on the development of guest and visitor numbers,” says Bernhard Wieser from the Institute of Human-Centred Computing at TU Graz. “They provide information on services or places worth visiting and should ideally take individual needs into account. However, there is a risk that certain groups or aspects are under-represented.” An important finding of the research was that the fairness being sought is a multi-stakeholder problem: it is not only end users who play a role, but numerous other actors as well.

These include service providers such as hotels and restaurants along the routes, as well as third parties such as municipalities and tourism organisations. And then there are stakeholders who never even come into contact with the app but are nevertheless affected, such as local residents who could feel the effects of overtourism. According to the study, the interests of all these stakeholders cannot be reconciled by technology alone. “If the app is to deliver the fairest possible results for everyone, the fairness goals must be clearly defined in advance. And that is a very human process that starts with deciding which target group to serve,” says Bernhard Wieser.

Involving all actors in the design

This target group decision influences the selection of the AI training data, its weighting and further steps in the algorithm design. To involve the other stakeholders as well, the researchers propose participatory design, in which all actors take part so that their ideas can be reconciled as far as possible. “Ultimately, however, you have to decide in favour of something, so it’s up to the individual,” says Dominik Kowald from the Fair AI group at the Know Center research centre and the Department of Digital Humanities at the University of Graz. “Not everything can be optimised at the same time with an AI model. There is always a trade-off.”
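To make this trade-off concrete, consider the following minimal sketch in Python. It is purely illustrative and is not code from the study or the Cyclebee app; all route names, objectives and scores are invented. It shows how candidate routes can be scored against several stakeholder objectives, and how the "best" route changes entirely depending on how humans weight those objectives.

# Minimal illustrative sketch (not code from the study): scoring candidate
# cycling routes against several stakeholder objectives. All names, routes
# and scores below are made up for illustration.

# Each route is rated per objective on a 0-1 scale (higher is better):
# how well it matches the user's preferences, how much visibility it gives
# local service providers, and how little burden it puts on residents.
routes = {
    "riverside": {"user_fit": 0.9, "provider_exposure": 0.4, "resident_relief": 0.3},
    "vineyards": {"user_fit": 0.6, "provider_exposure": 0.8, "resident_relief": 0.7},
    "old_town":  {"user_fit": 0.8, "provider_exposure": 0.7, "resident_relief": 0.2},
}

def recommend(weights: dict[str, float]) -> str:
    """Return the route with the highest weighted score for a given
    stakeholder weighting - a human design decision, not a model output."""
    return max(routes, key=lambda r: sum(weights[k] * v for k, v in routes[r].items()))

# Weighting only the end user's preferences favours one route ...
print(recommend({"user_fit": 1.0, "provider_exposure": 0.0, "resident_relief": 0.0}))
# ... while also weighting residents and providers favours another.
print(recommend({"user_fit": 0.4, "provider_exposure": 0.3, "resident_relief": 0.3}))

Raising one objective's weight necessarily lowers the relative influence of the others, which is the trade-off Kowald describes: no weighting can make every stakeholder's score maximal at the same time.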

Ultimately, it is up to the developers to decide what this trade-off looks like, but according to the researchers, transparency is important for both end users and providers. Users want to be able to adapt or influence the recommendations, and providers want to know the rules by which routes are set or providers are ranked. “Our study results are intended to support software developers in their work in the form of design guidelines, and we also want to provide guidelines for political decision-makers,” says Bernhard Wieser. “It is important that technological developments make recommender systems increasingly available to smaller, regional players. This would make it possible to develop fair solutions and thus create counter-models to multinational corporations, which would sustainably strengthen regional value creation.”

Study: Multistakeholder Fairness in Tourism: What Can Algorithms Learn from Tourism Management?
Authors: Peter Müllner, Anna Schreuer, Simone Kopeinik, Bernhard Wieser and Dominik Kowald (2025).
In: Frontiers in Big Data
DOI: https://doi.org/10.3389/fdata.2025.1632766

Contact

Bernhard WIESER
Assoc.Prof. Mag.phil. Dr.phil.
TU Graz | Institute of Human-Centred Computing
Phone: +43 316 873 30661
bernhard.wieser@tugraz.at

Dominik KOWALD
Univ.-Prof. Dipl.-Ing. Dr.techn. BSc
Know Center Research GmbH and University of Graz | Department of Digital Humanities
Phone: +43 664 619 1718
dkowald@know-center.at