Bachelor and Master Theses

To apply to conduct this thesis, please contact the thesis supervisor(s).
Title: Visualisation and Data-Driven Explainability in Graph-Based AI Workflows.
Subject: Computer Science, Robotics, Applied Artificial Intelligence, Software Engineering
Level: Basic, Advanced
Description:

Background: Industrial artificial intelligence (AI) applications work with a variety of data types, including sensor signals, images, and textual reports. To make the resulting data-driven models more understandable, it is essential not only to apply effective Explainable AI (XAI) techniques but also to adopt visualisation strategies that minimise cognitive load and enhance users' ability to form accurate mental models. MainlyAI stands out with its graph-based interactive workflows, which provide a distinctive way to present complex explanations visually and intuitively. This thesis explores how visualisation, combined with the characteristics of different data types and a system-level approach to demonstrating XAI techniques, can enhance user understanding and build trust in industrial environments.

Scope: The thesis focuses on visualisation strategies, data type considerations, and the demonstration of XAI theory within the MainlyAI platform. It will use industrial datasets as test cases to study how different data types affect explanation design and how MainlyAI can be extended to communicate both the practical outputs and the theoretical underpinnings of XAI methods.

Goals:

· Optimise data visualisation within MainlyAI to reduce cognitive load and support accurate mental model formation.

· Assess the influence of different data types (sensor, image, text) on explanation efficacy and demonstrate the utility of MainlyAI for communicating XAI concepts.

Task Description:

· Conduct a literature review on visualisation techniques, cognitive load theory, and multimodal XAI.

· Study the visualisation and markup features of MainlyAI.

· Select industrial datasets covering different data types (sensor readings, process images, textual logs).

· Implement ML models for these datasets inside MainlyAI.

· Design visualisation patterns and interactive controls (e.g., layered graphs, what-if panels, annotated nodes).

· Evaluate explanation effectiveness across different data types with representative users.

· Analyse results focusing on mental model accuracy and cognitive load.

· Demonstrate how MainlyAI can illustrate the theoretical underpinnings of XAI methods (e.g., SHAP, counterfactuals).

· Produce guidelines for visualisation and data-driven explanation design in graph-based XAI systems.

· Deliver a thesis report with system-level contributions and recommendations for MainlyAI’s evolution.
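To make the attribution-style explanations named in the tasks above concrete, the following is a minimal, self-contained sketch. It is not MainlyAI-specific: the synthetic "sensor" data, the Random Forest model, and the use of permutation importance as a simple model-agnostic stand-in for SHAP-style attributions are all assumptions made for illustration.

```python
# Illustrative sketch: model-agnostic feature attribution on synthetic
# sensor-like data. Permutation importance is used here as a simple
# stand-in for richer XAI methods such as SHAP.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                   # four synthetic sensor channels
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # label driven by channels 0 and 2

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Permutation importance: how much held-out accuracy drops when each
# channel is independently shuffled.
result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for i, imp in enumerate(result.importances_mean):
    print(f"sensor_{i}: {imp:.3f}")
```

In a graph-based workflow such as MainlyAI's, per-feature scores like these could then be rendered as annotated nodes or layered overlays, which is precisely the visualisation design question the thesis investigates.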

Start date: 2026-01-29
End date: 2026-06-26
Prerequisites:

AI & Machine Learning Expertise (good to have)

  • Deep Learning & ML Techniques: Experience with neural networks (e.g., LSTMs, CNNs) and ensemble models (e.g., XGBoost, Random Forest) for predictive modelling.
  • Feature Engineering: Ability to extract and preprocess relevant features from sensor data and simulations.
  • Model Validation & Optimisation: Techniques for hyperparameter tuning, cross-validation, and performance evaluation.
  • Explainability: Familiarity with XAI methods (e.g., SHAP, LIME, counterfactuals).

Software & Tools (good to have)

  • Programming Languages: Python for model development.
  • Machine Learning Frameworks: TensorFlow, PyTorch, Scikit-learn for training AI models.
  • Data Processing Tools: Pandas, NumPy, and SciPy for data handling and analysis.

IDT supervisors: Shahina Begum
Examiner: Mobyen Uddin Ahmed
Comments:
Company contact:

MainlyAI