Presentation Schedule June 10, 2019

Room 1 • 08:15-12:20

Examination Committee: Thomas Nolte, Kristina Lundqvist, Jan Carlson, Aida Causevic
Room: Lambda


08:15 - 09:00

DVA423

Applicability study of software architectures in the discrete manufacturing domain

Ermal Bizhuta, Dhespina Carhoshi
Advisor: Mohammad Ashjaei, Séverine Sentilles
Examiner: Thomas Nolte

Abstract: Manufacturing, under the umbrella of the latest industrial revolution, has gone through enormous changes in recent decades, later evolving into what we now know as smart manufacturing. Different companies and entities have developed their own versions of architectures for intelligent and digitalized manufacturing systems. Devising a flexible and safe architecture is one of the first steps towards a system intended to be applicable in different environments, regardless of the vast variety of possibilities available. For this purpose, this thesis presents an investigation of state-of-the-art solutions among the most recent digitalized cloud-based system architectures in the domain of discrete manufacturing. Starting from an initial system architecture conceived by ABB, an evaluation of this architecture was conducted, taking into consideration the existing systematic approaches to the digitalization of this industry. In this thesis work, we investigate, describe and evaluate the limitations and strengths of the most recent and well-known architectural approaches to cloud robotics. Finally, a few key remarks are made on ABB's initial solution, but also on the industry in general.


09:05 - 09:50

DVA423

Test script design approaches supporting reusability, maintainability and review process

Aleksandar Acimovic, Aleksandar Bajceta
Advisor: Adnan Causevic
Examiner: Kristina Lundqvist

Abstract: Software testing is widely considered to be one of the most important parts of the software development life-cycle. In this research, we investigated potential improvements in the testing process and in the design of automated test scripts inside Bombardier Transportation (BT). For the creation of automated test scripts, BT uses a group of programs called TAF (Test Automation Framework). These scripts are used for testing the Train Control Management System (TCMS), the software that manages the train. TAF can export its test scripts in XML format, and these XML scripts were analyzed in order to identify the most frequent changes. To better understand the life cycle of automated test scripts, the official documentation that defines the Verification and Validation process inside BT was analyzed, and an interview was conducted with one of the persons responsible for testing. We believe that we have found a possible solution for improving the testing process and the creation of automated test scripts at BT, and a proof-of-concept tool was developed to evaluate it. The main idea behind the tool is to write test scripts using keywords, which are based on an analysis of the test specification documentation. These keywords represent frequent actions that are tested on the train. By storing those actions in keywords, the reusability of the test scripts is increased. Also, because the keywords are based on natural language, they have a positive effect on the readability and maintainability of the test scripts.
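To illustrate the keyword-driven idea described in the abstract above, here is a minimal, hypothetical Python sketch (not BT's TAF, and with invented keyword names): a test script becomes a sequence of named actions that a small runner dispatches to implementation functions.

```python
# Hypothetical keyword implementations; real keywords would drive the TCMS test environment.
def set_train_speed(kmh):
    print(f"setting train speed to {kmh} km/h")

def apply_brakes():
    print("applying brakes")

def check_speed_below(kmh):
    print(f"verifying speed is below {kmh} km/h")

# Mapping from human-readable keywords to the functions that implement them.
KEYWORDS = {
    "SetTrainSpeed": set_train_speed,
    "ApplyBrakes": apply_brakes,
    "CheckSpeedBelow": check_speed_below,
}

# A test script written in keywords: readable, reusable steps instead of low-level commands.
script = [
    ("SetTrainSpeed", [80]),
    ("ApplyBrakes", []),
    ("CheckSpeedBelow", [5]),
]

for keyword, args in script:
    KEYWORDS[keyword](*args)  # dispatch each keyword to its implementation
```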


09:55 - 10:40

DVA423

Measuring Complexity of Natural Language Requirements in Industrial Control Systems

Kostadin Rajkovic
Advisor: Eduard Enoiu, Antonio Cicchetti
Examiner: Jan Carlson

Abstract: Requirements specification documents are one of the main sources of guidance in software engineering projects, and they contribute to the definition of the final product and its attributes. They can contain text, graphs, figures and diagrams; however, in industry they are still mostly written in Natural Language (NL), which is also a convenient way of representing them. As the size of software projects in industrial systems increases, requirements specification documents often grow in size and complexity as well, which can make them difficult to analyze. There is a need to provide stakeholders with a way of analyzing requirements in order to develop software projects more efficiently.

In this thesis, we investigate how the complexity of textual requirements can be measured in industrial systems. A set of requirements complexity measures was selected from the literature and adapted for application to real-world requirements specification documents. These measures were implemented in a tool called RCM and evaluated on requirements documentation provided by Bombardier Transportation AB. The statistical correlation between the selected measures was investigated on a sample of data from the provided documentation, and the analysis showed a significant correlation between a couple of the selected measures. In addition, a focus group was conducted with the goal of exploring the potential use of these metrics and the RCM tool in industrial systems, as well as the areas of potential improvement that future research could investigate.
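As a small illustration of the kind of correlation analysis mentioned above, the sketch below computes a Spearman rank correlation between two hypothetical requirement complexity measures; the measure names and values are invented and are not taken from the RCM tool or the Bombardier documentation.

```python
from scipy.stats import spearmanr

# Hypothetical per-requirement complexity scores from two measures
word_count        = [12, 45, 30, 78, 22, 61, 95, 40]
readability_score = [8.1, 14.2, 11.0, 18.5, 9.7, 15.9, 21.3, 12.4]

# Spearman correlation is rank-based, so it tolerates non-linear but monotonic relationships
rho, p_value = spearmanr(word_count, readability_score)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```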


10:45 - 11:30

DVA424

RoboRebeca: A New Framework to Design Verified ROS-Based Robotic Programs

Saeid Dehnavi
Advisor: Marjan Sirjani, Ali Sedaghatbaf
Examiner: Jan Carlson

Abstract: Robotic technology helps humans in different areas such as manufacturing, health care and education. Due to the ubiquitous computing revolution, today's focus is on mobile robots and their applications in a variety of cyber-physical systems. There are several powerful robot middlewares, such as ROS and YARP, that manage the complexity of robotic software implementation. However, they do not provide support for assuring important properties such as timeliness and safety. We believe that integrating model checking with a robot middleware helps developers design and implement high-quality robotic software. By defining a general conceptual model for robotic programs, in this thesis we present an integration of the Timed Rebeca modeling language (and its model checker) with ROS to automatically synthesize verified ROS-based robotic software. In this integration, the conceptual model is first mapped to a Timed Rebeca model, which is used to verify the desired properties. The Timed Rebeca model may be modified several times until the properties are satisfied. Finally, the verified Timed Rebeca model is translated to a ROS program based on a set of mapping rules. Experiments on several small-scale case studies indicate the usefulness and applicability of the proposed integration method.


11:35 - 12:20

DVA501

Systematic literature review of safety-related challenges for autonomous systems in safety-critical applications

Milos Ojdanic
Advisor: Elena Lisova, Irfan Sljivo
Examiner: Aida Causevic

Abstract: An increased focus on the development of autonomous safety-critical systems requires more attention to ensuring the safety of humans and the environment. The main objective of this thesis is to explore the state of the art and to identify the safety-related challenges being addressed when using autonomy in safety-critical systems. In particular, the thesis explores the nature of these challenges, the autonomy levels they address and the types of safety measures proposed as solutions. Above all, we characterize the safety measures by their degree of adaptiveness, when they are active, and their decision-making ability. This information was collected by conducting a Systematic Literature Review of publications from the past nine years. The results show an increase in publications addressing challenges related to the use of autonomy in safety-critical systems. We identified four high-level classes of safety challenges. The results also indicate that the focus of research has been on finding solutions for challenges related to fully autonomous systems, as well as solutions that are independent of the level of autonomy. Furthermore, considering the number of publications, the results show that non-learning solutions addressing the identified safety challenges prevail over learning ones, active solutions over passive ones, and decisive solutions over supportive ones.

12:25 - 13:10 LUNCH BREAK

Room 1 • 13:15-17:20

Examination Committee: Marjan Sirjani, Jan Carlson, Antonio Cicchetti
Room: Lambda


13:15 - 14:00

DVA501

Mapping UML diagrams to the Reactive Object Language (Rebeca)

Vladimir Djukanovic
Advisor: Antonio Cicchetti
Examiner: Marjan Sirjani

Abstract: The Unified Modeling Language (UML) is a de-facto standard modeling language and has been used for years in various industrial domains. It is a general-purpose language with an extensive syntax and notation that can be used to model a system of any kind. However, its semantics are under-specified and too broad, which leaves room for different interpretations. This hinders the ability to perform formal verification of produced models and introduces the need for a stricter and more rigorous specification. With that in mind, it is usually more suitable to map UML models to other domains where modeling concepts have stricter semantics. Notably, the Reactive Objects Language (Rebeca) is an actor-based language with a formal foundation and formal verification support. This work aims to bridge this gap in UML by proposing a comprehensive mapping procedure between UML concepts and Rebeca concepts, thus enabling a formalization of the subset of UML used for the mapping. In particular, we investigate Rebeca semantics, extracting them from selected examples, and for each of them we provide the corresponding UML semantics, as part of an iterative process. This process ends when all Rebeca semantics are exhausted and a comprehensive mapping procedure emerges. Additionally, validation is an important part of this thesis, as it aims to establish confidence in the developed mapping procedure (post-conversion validation) and to avoid performing the transformation if the design is not compatible with the mapping procedure (pre-conversion validation). As part of the pre-conversion validation, in order to establish compatibility with the mapping procedure, we provide an extensive list of correctness attributes. As part of the post-conversion validation, the mapping procedure is validated by transformation of the provided examples. The results of this transformation show the wide applicability of the mapping procedure and serve as an assertion of its comprehensiveness.


14:05 - 14:50

DVA501

Automated Synthesis of Model Comparison Benchmarks

Lorenzo Addazi
Advisor: Antonio Cicchetti
Examiner: Jan Carlson

Abstract: Model-driven engineering promotes the migration from code-centric to model-based software development. Systems consist of model collections integrating different concerns and perspectives, while semi-automated model transformations generate executable code combining the information from these. Raising the abstraction level to models requires appropriate management technologies supporting the various software development activities. Among these, model comparison represents one of the most challenging tasks and plays an essential role in various modelling activities. Its hardness has led researchers to propose a multitude of approaches adopting different approximation strategies and exploiting specific knowledge of the involved models. However, almost no support is provided for their evaluation against specific scenarios and modelling practices. This thesis presents Benji, a framework for the automated generation of model comparison benchmarks. Given a set of differences and an initial model, users generate the models resulting from the application of the former to the latter. Differences consist of preconditions, actions and postconditions expressed using a dedicated specification language. The generator converts benchmark specifications into design-space exploration problems and produces the final solutions along with a model-based description of their differences with respect to the initial model. A set of representative use cases is used to evaluate the framework against its design principles, which resemble the essential properties expected from model comparison benchmark generators.


14:55 - 15:40

DVA501

A Model-Driven Engineering approach for modeling Heterogeneous Embedded Systems

Vincenzo Stoico
Advisor: Federico Ciccozzi
Examiner: Jan Carlson

Abstract: The demand for high-performance systems has led designers to adopt heterogeneous embedded systems (HES). Their complexity has highlighted the need for methodologies and tools that ease their design. Model-Driven Engineering (MDE) can be crucial in facilitating the design of such systems. Research has demonstrated the use of MDE to create platform-specific models (PSM). The aim of this work is to support HES design targeting platform-agnostic models. The work is based on a well-defined use case comprising a software application, written following the CUDA programming model, executing on a CPU-GPU hardware platform. The use case is analyzed to define the main characteristics of a HES, and these concerns are included in a UML profile used to capture all the features of a HES. The profile is built as an extension of the MARTE modeling language. Finally, the Alf action language is applied to make the model executable. The results demonstrate the suitability of MARTE and Alf for creating executable HES models. Additional research is needed to further investigate the HES domain, and it remains to validate the UML profile against different programming models and hardware platforms.


15:45 - 16:30

DVA501

Constraints for avoiding SysML model inconsistencies

Cristian Capozucco
Advisor: Federico Ciccozzi, Jan Carlson
Examiner: Antonio Cicchetti

Abstract: Models are used in multiple phases of a development process for several purposes. However, models may contain inconsistencies. This is often because the modelling language leaves a certain degree of freedom to the modeller, or because the tool that implements the language and is used by the modeller does not provide enough support for identifying possible inconsistencies. This thesis identifies a number of possible modelling situations, defined with the Systems Modeling Language (SysML), that can lead to inconsistencies, and studies their causes, more specifically whether they are caused by the language specification or by how the specification is implemented in modelling tools. Moreover, we provide automatic validation checks that identify these inconsistencies by means of constraints defined with the Epsilon Validation Language. The results of this thesis are useful for modellers, since they help them automatically identify inconsistencies in models and thereby benefit the most from modelling activities.


16:35 - 17:20

DVA424

Automating Integration-Level Test Case Generation for Object-Oriented .NET Applications

Mehdi Qorbanpur
Advisor: Mehrdad Saadatmand
Examiner: Antonio Cicchetti

Abstract: While many tools have been created for automating unit testing in industry, automating integration testing has, because of its complexity, always been a challenge in software engineering. Although some industrial tools have been introduced in this context in recent years, none of them address automatic test case generation at the integration level. With the emergence of new distributed development environments and agile methodologies, the software development process has sped up considerably, and as a consequence the DevOps concepts of continuous integration and continuous delivery (CI/CD) have become more and more important. As a result, integration-level testing has been getting more attention from software specialists than before. In 2018, IntegrationDistiller [1] was introduced as an automated solution and tool, based on data-flow analysis techniques, for identifying integration scenarios and generating test cases for .NET applications using the Roslyn C# compiler APIs [2]. In this paper, after re-implementing the solution together with some improvements to the analysis algorithm, the validity of this approach was assessed by examining a couple of C# projects as benchmarks and comparing the results against the integration-level mutation operators for object-oriented applications used in jMINT [3]. Based on the reviewed literature, coupling-based analysis, and applicable Roslyn features, some future work is suggested at the end.

 

Room 2 • 09:15-11:40

Examination Committee: Ning Xiong, Alessandro Papadopoulos
Room: Kappa


09:15 - 10:00

DVA502

Machine learning for mechanical analysis

Sebastian Bengtsson
Advisor: Martin Ekström
Examiner: Ning Xiong

Abstract: It is not reliable to depend on a person's inference on dense, high-dimensional data on a daily basis: a person will grow tired and make mistakes over time. It is therefore desirable to study the feasibility of replacing a person's inference with that of machine learning. Support Vector Machines (SVM) are implemented and tested for anomaly detection and classification, and compared to the performance of back-propagation neural networks. Principal Component Analysis and autoencoders are used with the intention of increasing performance. One-class SVMs proved very effective in detecting anomalous samples. SVMs were also used for multiclass classification using the one-vs-all and one-vs-one approaches, producing promising results.
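As a rough, illustrative sketch of the techniques named in the abstract (not the thesis implementation), the snippet below uses scikit-learn's one-class SVM for anomaly detection and an RBF-kernel SVM for multiclass classification on synthetic data.

```python
import numpy as np
from sklearn.svm import OneClassSVM, SVC

rng = np.random.default_rng(0)

# One-class SVM: train on "normal" samples only, then flag outliers (-1) vs inliers (+1)
normal = rng.normal(0.0, 1.0, size=(200, 8))    # synthetic sensor features
suspect = rng.normal(4.0, 1.0, size=(5, 8))     # clearly anomalous samples
detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal)
print("anomaly labels:", detector.predict(suspect))   # expected: mostly -1

# Multiclass SVM: scikit-learn's SVC decomposes the problem one-vs-one internally
X = rng.normal(size=(300, 8))
y = rng.integers(0, 3, size=300)                # three synthetic fault classes
clf = SVC(kernel="rbf", decision_function_shape="ovo").fit(X, y)
print("predicted classes:", clf.predict(X[:5]))
```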


10:05 - 10:50

DVA502

Development of a robust cascade controller for a riderless bicycle

Tom Andersson, Niklas Persson
Advisor: Anas Fattouh, Martin Ekström
Examiner: Alessandro Papadopoulos

Abstract: A controlled riderless bicycle is desired for the purpose of testing autonomous vehicles' ability to detect and recognise cyclists. The bicycle, which is a highly unstable system with complex dynamics, has been a subject of research for over a century, and in the last decades controllers have been developed for autonomous bicycles. These controllers are often only evaluated in simulation, but some complex controllers have been implemented on real-life bicycles as well. The goal of this work is to validate the sensors and subsystems of an instrumented bicycle and to develop a robust controller which can balance the bicycle using actuation on the steering axis alone. Using an iterative design process, the sensor measuring the lean angle and the steering system are improved and validated. By sensing the lean angle, the handlebar is manipulated to make the bicycle stable. For this purpose, a P, a PD, two different PID, an LQR and a fuzzy controller are developed, evaluated and compared. The results show that the bicycle can ride without human interaction on a bicycle roller at different velocities. Additionally, numerous experiments are conducted in an outdoor environment in several different terrains, where the proposed control structure manages to balance and steer the bicycle.
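For readers unfamiliar with the controller types listed above, here is a minimal discrete PID controller sketch; the gains, sampling time and lean-angle value are hypothetical and unrelated to the actual bicycle implementation.

```python
class PID:
    """Minimal discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                      # accumulate error over time
        derivative = (error - self.prev_error) / self.dt      # rate of change of the error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical use: drive the measured lean angle (rad) towards upright (0 rad)
controller = PID(kp=8.0, ki=0.5, kd=1.2, dt=0.01)
lean_angle = 0.05                                  # assumed current lean angle from the IMU
steering_command = controller.update(setpoint=0.0, measurement=lean_angle)
print(f"steering command: {steering_command:.3f}")
```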


10:55 - 11:40

DVA502

Waveform clustering - Grouping similar power system events

Therése Eriksson, Mohamed Mahmoud
Advisor: Elaine Åstrand, Joaquín Ballesteros
Examiner: Ning Xiong

Abstract: Over the last decade, data has become a highly valuable resource. Electrical power grids deal with large quantities of data and continuously collect it for analytical purposes. Anomalies that occur within this data are important to identify, since they could cause non-optimal performance within the substations or, in worse cases, damage to the substations themselves. However, with datasets on the order of millions of records it is hard or even impossible to gain a reasonable overview of the data manually. When collecting data from electrical power grids, predefined triggering criteria are often used to indicate that an event has occurred within the specific system, which makes it difficult to search for events that are unknown to the operator of the deployed acquisition system. Clustering, an unsupervised machine learning method, can be utilised for fault prediction within systems generating large amounts of unlabelled multivariate time-series data, and it can group data more efficiently and without the bias of a human operator. A large number of clustering techniques exist, as well as methods for extracting information from the data itself, and identifying suitable ones was of utmost importance. This thesis presents a study of the methods involved in the creation of such a clustering system, suitable for this specific type of data. The objective of the study was to identify methods that enable finding the underlying structures of the data and to cluster the data based on these. The signals were split into multiple frequency sub-bands, from which features could be extracted and evaluated. Using suitable combinations of features, the data was clustered with two different clustering algorithms, CLARA and CLARANS, and evaluated with established quality analysis methods. The results indicate that CLARA performed best overall on all the tested feature sets. The formed clusters hold valuable information, such as indications of unknown events within the system, and if similar events are clustered together this can further assist a human operator in investigating the importance of the clusters themselves. A further conclusion from the results is that research into more optimised clustering algorithms is necessary before expansion to larger datasets can be considered.
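CLARA and CLARANS both belong to the k-medoids family of clustering algorithms. The sketch below shows the medoid-based core they share, assign each point to its nearest medoid and accept swaps that lower the total dissimilarity, in a deliberately simplified form; it is not the thesis implementation and omits CLARA's sampling and CLARANS' randomized search.

```python
import numpy as np

def simple_k_medoids(X, k, max_iters=20, seed=0):
    """Naive PAM-style k-medoids: the medoid-based core that CLARA and CLARANS build on."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(X), size=k, replace=False)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # full pairwise distance matrix

    def total_cost(meds):
        return dist[:, meds].min(axis=1).sum()   # every point pays its distance to the nearest medoid

    for _ in range(max_iters):
        improved = False
        for i in range(k):
            for candidate in range(len(X)):      # try swapping medoid i with every non-medoid point
                if candidate in medoids:
                    continue
                trial = medoids.copy()
                trial[i] = candidate
                if total_cost(trial) < total_cost(medoids):
                    medoids, improved = trial, True
        if not improved:                         # stop when no swap lowers the total dissimilarity
            break
    labels = dist[:, medoids].argmin(axis=1)
    return medoids, labels

# Two hypothetical groups of feature vectors (e.g. extracted from frequency sub-bands)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(20, 4)), rng.normal(3.0, 0.3, size=(20, 4))])
medoids, labels = simple_k_medoids(X, k=2)
print("medoid indices:", medoids, "cluster sizes:", np.bincount(labels))
```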

 

Room 3 • 13:15-16:40

Examination Committee: Mobyen Uddin Ahmed, Björn Lisper, Moris Behnam, Jan Gustafsson
Room: Case


13:15 - 13:45

DVA331

Non-contact based Smart Mirror to Monitor Person Sleepiness using Heart Rate Variability

Fanny Danielsson
Advisor: Hamidur Rahman
Examiner: Mobyen Uddin Ahmed

Abstract: Today, many strategies for monitoring health status and well-being rely on measurement methods connected to the body, e.g. sensors or electrodes. These are often complicated and require personal assistance to use. This paper proposes a new method that makes it possible for users to self-monitor their well-being and health status over time using a non-contact camera system. By examining a user's physiological parameters, one can extract measurements that can be used to monitor an individual's well-being. This paper focuses on measuring sleepiness with physiological parameters that are extracted via a non-contact camera system and based upon features of heart rate variability (HRV). The HRV metrics included and tested in this paper are SDNN, RMSSD, NN50 and pNN50 from the time domain, and VLF, LF and the LF/HF ratio from the frequency domain. Machine learning classification is performed in order to measure an individual's sleepiness from the given features. The machine learning classification models that gave the best results, in terms of accuracy, were Logistic Regression and Support Vector Machines. The highest accuracy achieved was 91.67% for the training set and 83.33% for the test set. This work has great potential for personal health care monitoring and can be further extended to detect other factors that could help a user monitor their well-being, such as stress level.
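For reference, the time-domain HRV metrics named in the abstract (SDNN, RMSSD, NN50, pNN50) are conventionally computed from a series of RR (inter-beat) intervals as in the sketch below; the sample values are hypothetical and the code is not taken from the thesis.

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """Compute standard time-domain HRV metrics from RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)                     # successive differences between adjacent intervals
    sdnn = rr.std(ddof=1)                   # standard deviation of all RR intervals
    rmssd = np.sqrt(np.mean(diffs ** 2))    # root mean square of successive differences
    nn50 = int(np.sum(np.abs(diffs) > 50))  # adjacent-interval pairs differing by more than 50 ms
    pnn50 = 100.0 * nn50 / len(diffs)       # NN50 as a percentage of all interval pairs
    return {"SDNN": sdnn, "RMSSD": rmssd, "NN50": nn50, "pNN50": pnn50}

# Hypothetical RR series (ms), e.g. extracted from a camera-based pulse signal
print(time_domain_hrv([812, 790, 845, 860, 805, 798, 910, 870]))
```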

Link to the thesis: http://www.idt.mdh.se/examensarbete/index.php?choice=show&lang=en&id=2148


13:50 - 14:20

DVA331

Contact-free Cognitive Load Classification based on Psycho-Physiological Parameters

Johannes Sörman, Rikard Gestlöf
Advisor: Hamidur Rahman
Examiner: Mobyen Uddin Ahmed

Abstract: Historically, the best measure of cognitive load (CL) is EEG, but EEG has the disadvantage that wires need to be connected to the head of the test person. This can be a problem because it might affect the results in situations where the participants need to move their body parts. The goal of our research is therefore to test the performance of a contact-free, camera-based system in detecting CL. If the camera system proves to be as good as or better than a contact-based system, it could open new possibilities when conducting experiments out in the field, since the camera system does not have to be attached to a participant and therefore will not be a distraction while performing the experiments. In order to determine which system is better at measuring different levels of CL, controlled experiments were conducted with both systems. The controlled experiments consisted of completing four puzzles and playing two courses of Mario Kart: Double Dash. To provide a reference point, the participants also needed to sit at rest for a period of time. It was concluded that KNN and SVM were the best algorithms for the collected data, and in order to improve the classification accuracy, different kernel functions were chosen for specific subsets of the data. The results of this research show that the algorithms achieved a higher classification accuracy on the data collected with the camera-based system than on that from the contact-based Shimmer sensor system. The highest mean classification accuracy was 81% for binary classification S0-S2, collected by the camera while driving, using the Fine KNN algorithm.


14:25 - 14:55

DVA331

Implementation of Data Parallel Primitives on MIMD Shared Memory Systems

Christian Mortensen
Advisor: Daniel Hedin
Examiner: Björn Lisper

Abstract: This thesis presents an implementation of a multi-threaded C library for performing data parallel computations on MIMD shared memory systems, with support for user-defined operators and one-dimensional sparse arrays. Multi-threaded parallel execution was achieved through the use of POSIX threads, and the library exposes several functions for performing data parallel computations directly on arrays. The implemented functions were based on a set of primitives that many data parallel programming languages have in common. The scalability of the individual primitives varied greatly, with most of them only gaining a significant speedup when executed on two cores, followed by a significant drop-off in speedup as more cores were added. An exception to this was the reduction primitive, however, which managed to achieve near-optimal speedup in most tests. The library proved unviable for expressing algorithms requiring more than one or two primitives in sequence, due to the overhead that each of them causes.
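The thesis library itself is written in C on top of POSIX threads; purely as an illustration of the reduction primitive it describes (the one that scaled best), the sketch below expresses the same idea, split the array into chunks, reduce each chunk in parallel, then combine the partial results, using Python's multiprocessing module.

```python
from functools import reduce
from multiprocessing import Pool
from operator import add

def chunk_reduce(chunk):
    """Reduce one chunk sequentially; runs in a worker process."""
    return reduce(add, chunk)

def parallel_reduce(data, workers=4):
    """Data-parallel reduction: per-chunk partial results, then a final combine step."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(processes=workers) as pool:
        partials = pool.map(chunk_reduce, chunks)  # one partial result per chunk, in parallel
    return reduce(add, partials)                   # combine the partial results

if __name__ == "__main__":
    values = list(range(1_000_000))
    print(parallel_reduce(values))                 # same result as sum(values)
```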


15:00 - 15:30

DVA331

Load Balancing of Parallel Tasks using Memory Bandwidth Restrictions

Tommy Ernsund, Linus Sens Ingels
Advisor: Jakob Danielsson
Examiner: Moris Behnam

Abstract: Shared resource contention is a significant problem in multi-core systems and can have a negative impact on the system. Memory contention occurs when the different cores in a processor access the same memory resource, resulting in a conflict. It is possible to limit memory contention through resource reservation, where a part of the system or an application is reserved a partition of the shared resource. We investigated how applying memory bandwidth restrictions using MemGuard can aid in synchronizing the execution times of parallel tasks, and when such restrictions are applicable. To this end, we conducted three experiments. Firstly, we conducted an experiment to pinpoint when the memory bandwidth saturates a core. Secondly, we investigated the performance of our adaptive memory partitioning scheme against static partitioning and no partitioning. Finally, we tested how our adaptive partitioning scheme and static partitioning can isolate a workload against an interfering memory-intensive workload running on a separate core. As the experiments were only conducted on one system, pinpointing the general point of contention was difficult, since it can differ significantly from system to system. Through our experiments, we see that memory bandwidth partitioning has the ability to decrease the execution time of feature detection algorithms, which means that memory bandwidth partitioning can potentially help threads reach their synchronization points simultaneously.


15:35 - 16:05

DVA331

Interface analysis of a mobile application for alcohol dependence / Gränssnittsanalys av en mobilapplikation mot alkoholmissbruk

Rodny Andersson
Advisor: Rikard Lindell
Examiner: Jan Gustafsson

Abstract: There are currently several types of dependence among the population, one of the most common being alcohol dependence. Alcohol dependence burdens both the individual and society, and it is therefore important to find methods for treating it. Several different methods are used to treat alcohol dependence, two of which are medication and consultation. Recently, a new kind of tool has been introduced in the field: mobile applications. Mobile applications whose purpose is to create a behaviour change can use several design principles from the Persuasive Systems Design (PSD) framework. In this thesis, the interface of the application Previct Task has been analysed. The purpose of the study was to produce a design proposal that increases the user's motivation compared to the current interface. The results can be used as guidelines for how an interface for treating alcohol dependence can be designed. In order to develop a design proposal, the most important design principles from PSD were identified by users and healthcare providers, and the functions in the current interface were examined through a heuristic evaluation and by the users. The results show that the design principles Self-monitoring, Praise, Rewards and Trustworthiness were the most important ones for the users. The current interface partly meets these design principles, but they can be better supported. The results show that the application's idea is good, but that the application's structure and design need to be improved in order to make its users feel more motivated.

Sammanfattning (Swedish summary): In today's society there are several types of dependence, one of the most common being alcohol dependence. Alcohol dependence has large negative effects for the individual, but also for society. It is therefore important to find effective methods for treating alcohol dependence. Two methods used to create a behaviour change among dependent persons are medication and consultation. Recently, a new kind of tool has begun to be used in the field, namely mobile applications. Mobile applications whose goal is to create a behaviour change can use design principles from the Persuasive Systems Design (PSD) framework. In this study, the interface of the application Previct Task has been the focus. The application is used as a tool in combination with the patient being in contact with a caregiver. The purpose of the study was to produce a design proposal that can increase the users' motivation compared to the current interface. The results can be used as guidelines for how interfaces for applications intended to treat alcohol dependence can be designed. To obtain the results, the current interface was evaluated through a heuristic evaluation, and the most important design principles from PSD and the functions in the interface were also identified by the users and the caregivers. The results show that the design principles Self-monitoring, Praise, Rewards and Trustworthiness were the most important for motivating the users. The current interface partly fulfilled the design principles, but I could see that the design principles could be improved. The conclusion of the study was that the application's idea is good, but that the application's structure and execution need to be improved in order for the users to feel more motivated.


16:10 - 16:40

DVA331

Achieving a reusable reference architecture for microservices in Cloud environments

Zacharias Leo
Advisor: Radu Dobrin, Václav Struhár
Examiner: Jan Gustafsson

Abstract: Microservices are a new trend in application development. They allow big monolithic applications to be broken down into smaller parts that can be updated and scaled independently. However, there are still many uncertainties when it comes to microservice standards, which can lead to costly and time-consuming creations or migrations of system architectures. One of the more common ways of deploying microservices is through the use of containers and a container orchestration platform, most commonly the open-source platform Kubernetes. In order to speed up the creation or migration, it is possible to use a reference architecture that acts as a blueprint to follow when designing and implementing the architecture. Using a reference architecture leads to more standardized architectures, which in turn are more time and cost effective.

This thesis proposes such a reference architecture to be used when designing microservice architectures. The goal of the reference architecture is to provide a product that meets the needs and expectations of companies that already use microservices or might adopt microservices in the future. In order to achieve the goal of the thesis, the following research questions were answered:

What are the key expectations on non-functional requirements of companies on a microservice based architecture?
What are usually the core features implemented into reference architectures, and how can they fit in the proposed microservice reference architecture?
Which Kubernetes components should be used in the proposed reference architecture, based on the company expectations?

To answer the research questions the work was divided into three main phases. First, a questionnaire was conducted and sent out to be answered by experts in the area of microservices or system architectures. Second, literature studies were made on the state of the art and practice of reference architectures and microservice architectures. Third, studies were made on the Kubernetes components found in the Kubernetes documentation, which were evaluated and chosen depending on how well they reflected the needs of the companies.

This thesis finally proposes a reference architecture with components chosen according to the needs and expectations of the companies, as identified in the questionnaire.