Bachelor and Master Theses

To apply for this thesis, please contact the thesis supervisor(s).
Title: Analyzing Communication Efficiency in Federated Learning for Network-Based Intrusion Detection
Subject: Computer network engineering
Level: Basic
Description:

Background:

Smart-grid and IoT infrastructures rely on distributed, networked embedded devices that must process data locally while ensuring privacy and reliability. Federated Learning (FL) allows these devices to collaboratively train a shared machine-learning model without transmitting raw data to a central server.
While privacy is preserved, FL introduces communication overhead—each client repeatedly exchanges model updates with a server. For embedded and edge systems, this additional traffic can affect latency, scalability, and energy use. Understanding these trade-offs is critical when deploying FL in cybersecurity domains such as network-based intrusion detection.
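
As a rough illustration (the numbers below are assumptions, not measurements), per-round FedAvg traffic scales with the model size, since every participating client downloads the global model and uploads an update of roughly the same size:

    # Rough FedAvg traffic estimate: each round, every selected client downloads
    # the global model and uploads an update of about the same size.
    def fedavg_traffic_bytes(num_rounds, clients_per_round, num_params, bytes_per_param=4):
        model_bytes = num_params * bytes_per_param                # float32 weights
        return num_rounds * clients_per_round * 2 * model_bytes   # 2 = download + upload

    # Example: ~100k-parameter MLP, 10 clients per round, 50 rounds -> ~400 MB
    print(fedavg_traffic_bytes(50, 10, 100_000) / 1e6, "MB")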

Aim:

To implement a simple federated-learning setup for intrusion detection and evaluate its communication efficiency, network load, and convergence performance across different configurations.

Main Tasks:

  • Implement a federated-learning environment using Flower or TensorFlow Federated (TFF); a minimal Flower-based sketch follows this list.
  • Use a public intrusion-detection dataset to train a small model (e.g., MLP or CNN) within the federated setup.
  • Measure key performance indicators:
    • Number of communication rounds to reach target accuracy
    • Total bytes exchanged between clients and server
    • Latency and runtime per round
    • (Optional) CPU or memory usage per client
  • Analyze the results (e.g., accuracy vs. communication rounds, bytes exchanged vs. accuracy).
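
The sketch below is one possible starting point, not a prescribed implementation. It assumes Flower's Python simulation API (pip install "flwr[simulation]" tensorflow, flwr 1.x) and uses randomly generated feature vectors as a stand-in for a real intrusion-detection dataset such as NSL-KDD or CIC-IDS2017; all names (IDSClient, build_model, weighted_accuracy) are illustrative.

    # Minimal Flower simulation sketch (assumes flwr 1.x with the ray-based
    # simulation extra and TensorFlow installed). Dataset, model size, and
    # client count are placeholders to be replaced with the real setup.
    import flwr as fl
    import numpy as np
    import tensorflow as tf

    NUM_CLIENTS = 5
    NUM_FEATURES = 20      # placeholder for the dataset's feature count
    NUM_CLASSES = 2        # e.g., benign vs. attack

    def build_model():
        # Small MLP; its parameter count directly drives per-round traffic.
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(NUM_FEATURES,)),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    class IDSClient(fl.client.NumPyClient):
        def __init__(self, x_train, y_train, x_val, y_val):
            self.model = build_model()
            self.x_train, self.y_train = x_train, y_train
            self.x_val, self.y_val = x_val, y_val

        def get_parameters(self, config):
            return self.model.get_weights()

        def fit(self, parameters, config):
            self.model.set_weights(parameters)
            self.model.fit(self.x_train, self.y_train,
                           epochs=1, batch_size=32, verbose=0)
            weights = self.model.get_weights()
            # Upload size this round; the download is roughly the same size.
            upload_bytes = int(sum(w.nbytes for w in weights))
            return weights, len(self.x_train), {"upload_bytes": upload_bytes}

        def evaluate(self, parameters, config):
            self.model.set_weights(parameters)
            loss, acc = self.model.evaluate(self.x_val, self.y_val, verbose=0)
            return float(loss), len(self.x_val), {"accuracy": float(acc)}

    def client_fn(cid):
        # Random data as a stand-in for one client's share of a public
        # intrusion-detection dataset (e.g., NSL-KDD, CIC-IDS2017).
        rng = np.random.default_rng(int(cid))
        x = rng.normal(size=(1000, NUM_FEATURES)).astype("float32")
        y = rng.integers(0, NUM_CLASSES, size=1000).astype("int32")
        return IDSClient(x[:800], y[:800], x[800:], y[800:])

    def weighted_accuracy(metrics):
        # Aggregate client accuracies weighted by their number of samples.
        total = sum(n for n, _ in metrics)
        return {"accuracy": sum(n * m["accuracy"] for n, m in metrics) / total}

    history = fl.simulation.start_simulation(
        client_fn=client_fn,
        num_clients=NUM_CLIENTS,
        config=fl.server.ServerConfig(num_rounds=10),
        strategy=fl.server.strategy.FedAvg(
            fraction_fit=1.0,
            fraction_evaluate=1.0,
            fit_metrics_aggregation_fn=lambda m: {
                "round_upload_bytes": sum(d["upload_bytes"] for _, d in m)},
            evaluate_metrics_aggregation_fn=weighted_accuracy,
        ),
    )
    print(history.metrics_distributed)       # accuracy per round
    print(history.metrics_distributed_fit)   # bytes uploaded per round

From the returned history object the rounds needed to reach a target accuracy and the total bytes exchanged can be read off directly; per-round latency and runtime can be added with simple timestamps inside fit().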

Possible Extensions (Optional, for a team of two students):

  • Compare FedAvg and FedProx aggregation algorithms.
  • Evaluate IID vs. Non-IID data distributions (e.g., via a Dirichlet label split; see the sketch after this list).
  • Study the effect of model size on communication cost.
  • Create a small visualization dashboard (e.g., Streamlit) for real-time monitoring.
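
For the Non-IID extension, one common approach (an assumption here, not a requirement of the thesis) is to skew each client's label distribution with a Dirichlet split; the helper name dirichlet_partition below is illustrative:

    import numpy as np

    def dirichlet_partition(labels, num_clients, alpha=0.5, seed=0):
        # Assign sample indices to clients so that each class is spread across
        # clients according to a Dirichlet distribution. Smaller alpha gives
        # more skewed (more non-IID) clients; large alpha approaches IID.
        rng = np.random.default_rng(seed)
        num_classes = int(labels.max()) + 1
        client_indices = [[] for _ in range(num_clients)]
        for c in range(num_classes):
            idx = np.flatnonzero(labels == c)
            rng.shuffle(idx)
            proportions = rng.dirichlet([alpha] * num_clients)
            cut_points = (np.cumsum(proportions) * len(idx)).astype(int)[:-1]
            for client, part in enumerate(np.split(idx, cut_points)):
                client_indices[client].extend(part.tolist())
        return [np.array(ci) for ci in client_indices]

    # Example: 10 000 binary labels split across 5 clients, strongly non-IID
    labels = np.random.randint(0, 2, size=10_000)
    print([len(p) for p in dirichlet_partition(labels, num_clients=5, alpha=0.1)])

The same split can feed the client_fn of the sketch above, and the FedAvg-vs-FedProx comparison can be set up by swapping in the FedProx strategy that recent Flower releases provide.
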
Start date:
End date:
Prerequisites:

  • Skills: basic Python programming; interest in distributed systems and network performance
  • Familiarity with ML frameworks (e.g., TensorFlow)
  • No deep-learning expertise required—existing frameworks will be used.
IDT supervisors: Maryam Vahabi
Examiner: Hossein Fotouhi
Comments:
Company contact: