Bachelor and Master Theses

To apply to conduct this thesis, please contact the thesis supervisor(s).
Title: Explainable AI - Take part in creating the future of human-system interaction
Subject: Computer science, Embedded systems, Robotics, Software engineering
Level: Basic, Advanced
Description:

Background

In industrial settings, AI holds the potential for significant improvements, such as more energy-efficient and sustainable operations and increased throughput. According to a Gartner research study, only 15% of the AI solutions deployed by 2022 were expected to be successful. Regardless of a model’s statistical performance, process experts often do not trust an ML model whose inner reasoning is not clear to them. Even when such models are deployed, end-users struggle to comprehend their outputs and actions.

To realize the potential benefits of AI in industrial applications, domain experts and end-users need to be given insight into the internal reasoning of ML (machine learning) models. This is the subject of XAI (Explainable AI) research, which has gained considerable attention in recent years. However, the needs and characteristics of users and their working contexts are very diverse in industrial applications, and these aspects need to be considered when designing industrial AI systems. Hence, it is imperative to consider diverse industrial users, use-cases, and data to develop a better understanding of the context and requirements of appropriate AI solutions.

Research questions

Using the field study material we have gathered as a foundation for understanding the industrial context of mining or pulp and paper, the following areas are proposed for the thesis student to dive deeper into:

·       How do different types of explanations affect end-user understanding of an ML model? (A minimal illustrative sketch of one explanation type is shown after this list.)

·       How and when should feedback be collected from an end-user?

·       How can data be visualized to help end-users build accurate mental models and reduce cognitive load?

·       What effect do data types have on understanding explanations?
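
As a purely illustrative sketch (not part of the thesis brief), the Python snippet below shows what one common type of explanation, global feature importance, could look like in practice: a model is fitted on a public toy dataset and scikit-learn's permutation importance is visualized as a bar chart. The dataset, model, and library choices are assumptions made only for illustration; the thesis would work with industrial field data and whichever XAI techniques are selected.

    # Illustrative sketch: permutation feature importance as one "type of explanation".
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split
    import matplotlib.pyplot as plt

    # Toy stand-in for industrial process data; a real study would use field data.
    X, y = load_breast_cancer(return_X_y=True, as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

    # How much does prediction accuracy drop when each feature is shuffled?
    result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
    order = result.importances_mean.argsort()[-10:]  # ten most influential features

    plt.barh(X.columns[order], result.importances_mean[order])
    plt.xlabel("Mean decrease in accuracy when the feature is permuted")
    plt.title("One possible explanation: global feature importance")
    plt.tight_layout()
    plt.show()

A sketch like this could serve as a baseline explanation against which other types (for example local, example-based, or counterfactual explanations) are compared in user studies.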

 

Deliverables

Deliverables can vary depending on whether the student has a Human Factors, HCI, or Computer Science background. Examples of desirable deliverables are: research methods for XAI to gather inputs (needs, requirements), personas, user experience maps, demos/prototypes (preferably coded), interaction flows, and validation methods for XAI technologies. Learnings from your work, in the form of recommendations or guidelines, will also be beneficial for your master’s thesis and for ABB.

Start date: 2023-01-16
End date: 2023-06-30
Prerequisites:

Knowledge of AI/ML, Python programming, and data visualization

IDT supervisors: Mobyen Uddin Ahmed
Examiner: Shahina Begum
Comments:
Company contact:

User Experience Research Team

Automation Technologies Department

ABB AB, Corporate Research