Bachelor and Master Theses

To apply for this thesis, please contact the supervisor(s) listed below.
Title: Interactive Graph-Based Explainable AI for Industrial Applications
Subject: Computer science, Robotics, Software engineering, Applied Artificial Intelligence
Level: Basic, Advanced
Description:

Background: Artificial intelligence (AI) is transforming industrial sectors such as mining and the pulp and paper industry by enhancing efficiency and promoting sustainability. A significant challenge, however, is the lack of transparency of AI models, which is often met with scepticism by industry professionals. This is where Explainable AI (XAI) comes into play: its goal is to make AI model behaviour more understandable and interpretable.

While many generic XAI techniques exist, their adaptation to the workflows of industrial experts remains limited; this is a vital gap that needs addressing. Platforms like MainlyAI offer a unique approach by providing interactive environments where users can visualise explanations and seamlessly integrate their feedback into their workflows, creating a more trustworthy and practical bridge between advanced AI technologies and the expertise of industry professionals.

Scope: This thesis focuses on studying how different explanation modalities can be presented in an interactive graph environment and how user feedback can be effectively integrated. The scope includes industrial use cases such as predictive maintenance, with XAI methods implemented and evaluated inside MainlyAI.

Goals:

· Evaluate the impact of explanation modalities (feature importance, counterfactuals, rule-based explanations) on user comprehension and trust when implemented in interactive graph workflows.

· Design and assess user feedback mechanisms within graph-based XAI systems to refine explanations and improve usability.

Task Description:

· Conduct a literature review on explainable AI methods and human factors in industrial AI.

· Study MainlyAI’s architecture and capabilities for interactive workflows.

· Select 1–2 industrial use cases (e.g., predictive maintenance, process optimisation).

· Implement ML models for selected use cases inside MainlyAI.

· Integrate multiple explanation modalities (feature importance, counterfactuals, rule-based).

· Design feedback mechanisms (e.g., sliders, annotation nodes, survey prompts) within the MainlyAI workflow.

· Conduct user evaluations with representative participants (e.g., engineers, process experts, or proxies).

· Analyse results focusing on comprehension, trust, and the refinement of explanations.

· Deliver a thesis report with validated guidelines for interaction and feedback in graph-based XAI.
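To make the feature-importance task above concrete, here is a minimal sketch of permutation importance applied to a toy predictive-maintenance classifier. The model, feature names, and thresholds are invented for illustration only; the actual thesis work would compute importances for real models and sensor data inside MainlyAI.

```python
import random

def permutation_importance(model, X, y, metric, n_repeats=10, seed=0):
    """Estimate each feature's importance by shuffling that column and
    measuring the average drop in the model's score."""
    rng = random.Random(seed)
    baseline = metric(y, [model(row) for row in X])
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_repeats):
            col = [row[j] for row in X]
            rng.shuffle(col)
            X_perm = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
            drops.append(baseline - metric(y, [model(row) for row in X_perm]))
        importances.append(sum(drops) / n_repeats)
    return importances

def accuracy(y_true, y_pred):
    return sum(a == b for a, b in zip(y_true, y_pred)) / len(y_true)

# Toy predictive-maintenance model: predict failure (1) when the
# vibration reading exceeds 0.5; temperature is ignored entirely.
model = lambda row: int(row[0] > 0.5)  # row = [vibration, temperature]
X = [[0.1, 30], [0.9, 31], [0.2, 70], [0.8, 72], [0.3, 40], [0.7, 41]]
y = [model(row) for row in X]          # labels depend only on vibration

importances = permutation_importance(model, X, y, accuracy)
# Expect a clearly positive importance for vibration and exactly zero
# for temperature, since the model never reads it.
```

In an interactive graph environment, importances like these would feed a visualisation node so that users can inspect which sensors drive a prediction.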


Start date: 2026-01-29
End date: 2026-06-26
Prerequisites:

AI & Machine Learning Expertise (good to have)

  • Deep Learning & ML Techniques: Experience with neural networks (e.g., LSTMs, CNNs) and ensemble models (e.g., XGBoost, Random Forest) for predictive modelling.
  • Feature Engineering: Ability to extract and preprocess relevant features from sensor data and simulations.
  • Model Validation & Optimisation: Techniques for hyperparameter tuning, cross-validation, and performance evaluation.
  • XAI Methods: Familiarity with, e.g., SHAP, LIME, and counterfactual explanations.
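To illustrate the counterfactual explanations mentioned above, the sketch below performs a greedy single-feature counterfactual search on a toy failure model. Both the search strategy and the model are illustrative assumptions for this announcement, not a specific method from the SHAP or LIME libraries.

```python
def counterfactual(model, row, target, feature_ranges, step=0.05):
    """Scan each feature over its allowed range and return the smallest
    single-feature change that flips the model's prediction to `target`,
    as (feature_index, new_value), or None if no such change exists."""
    best = None
    for j, (lo, hi) in enumerate(feature_ranges):
        v = lo
        while v <= hi + 1e-9:
            candidate = row[:j] + [v] + row[j + 1:]
            if model(candidate) == target:
                delta = abs(v - row[j])
                if best is None or delta < best[2]:
                    best = (j, v, delta)
            v += step
    return None if best is None else (best[0], best[1])

# Toy model: predict failure (1) when vibration exceeds 0.5.
model = lambda row: int(row[0] > 0.5)  # row = [vibration, temperature]
row = [0.7, 40.0]                      # currently predicted as failure
cf = counterfactual(model, row, target=0,
                    feature_ranges=[(0.0, 1.0), (20.0, 80.0)])
# The search should point at the vibration feature, since no change in
# temperature alone can flip the prediction.
```

A counterfactual like this answers the question an engineer actually asks: "what would have to change for this machine not to be flagged?"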

Software & Tools (good to have)

  • Programming Languages: Python for model development.
  • Machine Learning Frameworks: TensorFlow, PyTorch, Scikit-learn for training AI models.
  • Data Processing Tools: Pandas, NumPy, and SciPy for data handling and analysis.

IDT supervisors: Shahina Begum
Examiner: Mobyen Uddin Ahmed
Comments:
Company contact:

MainlyAI