Digital Humanism 2020 (ICT20-055)

Algorithmic governance of care

Principal Investigator:
Institution:
Project title: Algorithmic governance of care
Collaborators:
Roger von Laufenberg (VICESSE) (Co-Principal Investigator)
Vera Gallistl (Karl Landsteiner Privatuniversität) (Co-Principal Investigator)
Status: Ongoing (01.12.2021 – 30.11.2024)
Funding volume: € 429,940
Abstract:

Care work in long-term care (LTC) is considered a genuinely human-centred activity, requiring empathy, emotional investment, physical encounters and intimate, trust-based relations between care-givers and care-recipients. Artificial Intelligence (AI) technologies are being introduced into this professional field to assist care workers in their daily activities and to provide an additional measure of care for clients. This has changed the provision of care, affecting care-givers and care-recipients alike. So far, little research has examined the biases that emerge from AI in this field or the risks that algorithmic governance of care poses to the profession. When decisions are based on data generated by AI technologies, unfair outcomes can remain unnoticed in the process of linking different big data sets, leading to ethical and social issues in LTC. ALGOCARE's goal is to understand the functionality and biases of algorithmic systems governing care and their effects on care-givers and care-recipients. Insights from qualitative case-study research in LTC will provide an understanding of the impact of AI systems and of the needs of care in relation to them. The project explores the use-value of explainable AI (xAI) methods (trustworthiness, fairness, explainable procedures) and different levels of transparency, whether provided by the model itself or by methods applied before or after model development. Based on these insights, tools and metrics are developed to evaluate the explainability of AI for care.
