Project groups


Project 1: TimeReferee

Client: Anders Florinus, Västerås Tidtagning

Project group:


Project description:

Västerås Tidtagning provides timing equipment for swimming competitions. As part of their efforts to raise the quality of swimming competitions in Sweden, they would like an iPad app (called "TimeReferee") that allows both main and auxiliary referees to register disqualifications through their portable devices during a swimming competition. "TimeReferee" will consist of two parts: an iPad application and a server application for Windows PCs.
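To make the two-part architecture concrete, here is a minimal sketch of the kind of record a referee's device might send and how the server half could collect reports per heat. All names, fields and the in-memory "server" are illustrative assumptions, not part of the client's specification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Disqualification:
    heat: int
    lane: int
    rule: str       # infraction reference, e.g. a stroke rule (assumed field)
    referee: str    # "main" or "auxiliary" (assumed field)

class DQServer:
    """Hypothetical server-side store for disqualification reports."""

    def __init__(self):
        self.reports = []

    def register(self, dq: Disqualification):
        # In the real system this would arrive over the network from an iPad.
        self.reports.append(dq)

    def for_heat(self, heat):
        # Let officials review all reports filed for one heat.
        return [dq for dq in self.reports if dq.heat == heat]
```

The point of the sketch is only the data flow: many referee devices register reports independently, and the Windows server aggregates them per heat.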


Project 2: Virtual workspaces

Client: Henrik Jonsson, Etteplan

Project group:


Initial project description:

The expected outcome of the project is a web-based tool for creating virtual workspaces in which a user can work with (view, edit, add and delete) source code and perhaps lightweight models (drawing boxes, text and connections) from any computer, tablet or even smartphone. The content of a workspace should be maintained in a version control system, so the user should be able to add, delete and modify any number of artifacts before committing them to a repository. Virtual workspaces should be shareable with other users, and a check-in history should be available for each item in a workspace.
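The staged-commit model described above (change any number of artifacts, then commit them as one revision, with per-item history) can be sketched as follows. This is an illustrative in-memory sketch only; the class and method names are assumptions, not the client's design.

```python
from dataclasses import dataclass, field

@dataclass
class Workspace:
    files: dict = field(default_factory=dict)    # committed content: name -> text
    staged: dict = field(default_factory=dict)   # pending changes; None marks a delete
    history: dict = field(default_factory=dict)  # name -> list of (revision, content)
    revision: int = 0

    def stage(self, name, content):
        # Stage an add or a modification.
        self.staged[name] = content

    def stage_delete(self, name):
        self.staged[name] = None

    def commit(self):
        # Apply all staged changes as a single new revision.
        self.revision += 1
        for name, content in self.staged.items():
            if content is None:
                self.files.pop(name, None)
            else:
                self.files[name] = content
            self.history.setdefault(name, []).append((self.revision, content))
        self.staged.clear()
        return self.revision
```

Staging any number of adds, edits and deletes before a single commit mirrors the requirement that a user can touch "any number of artifacts" before committing, and the per-item history list corresponds to the check-in history required for each item.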


Project 3: Work orders and time reports

Client: Peter Valfridsson, Byggwalle AB

Project group:


Initial project description:

The goal of this project is to develop a system for managing work orders and time reports for project-based work in the construction industry. Work orders should be created in a web-based administration system and sent to an Android application, where each project is presented in a simple but clear manner. From the Android app it should be possible to report time and add work items for each order/project.
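The order/report flow above implies a small data model: the web side creates work orders, and the Android side adds work items and time entries against them. A minimal sketch, with all names and fields assumed for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class TimeEntry:
    worker: str
    hours: float

@dataclass
class WorkItem:
    description: str
    entries: list = field(default_factory=list)

@dataclass
class WorkOrder:
    order_id: int
    project: str
    items: list = field(default_factory=list)

    def add_item(self, description):
        # Android side: add a work item to the order.
        item = WorkItem(description)
        self.items.append(item)
        return item

    def report_time(self, item, worker, hours):
        # Android side: report time against one work item.
        item.entries.append(TimeEntry(worker, hours))

    def total_hours(self):
        # Web/admin side: summarize reported time per order.
        return sum(e.hours for item in self.items for e in item.entries)
```

Keeping time entries attached to work items, and work items attached to orders, gives the administration system a natural place to aggregate reports per order/project.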


Project 4: Interconnected Kinect devices - A 360 View

Client: Afshin Ameri, MDH

Project group:


Initial project description:

The aim of the project is to track people's movements, actions and body gestures using 6 (or more) Microsoft Kinect sensors, giving the system a 360-degree view of an environment. The application should also support Kinect Zoom lenses that may be fitted to the Kinect sensors. A demo program is required that visualizes the environment from a top view using the Kinect sensors. The demo program should show the location of each individual as well as their actions and gestures (jumping, sitting, waving hands, etc.). The main issue to be addressed is keeping track of an individual as they move across the fields of view of the different Kinects without re-registering them as new individuals.
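The core tracking issue can be illustrated with a simple greedy nearest-neighbour tracker: if each Kinect reports detections already transformed into a shared world coordinate frame, new detections are matched to existing tracks by distance, so a person keeps one ID while walking from one sensor's field of view into the next. This is a sketch under those assumptions; the class name, the world-frame precondition and the distance threshold are all illustrative, not part of the project specification.

```python
import math

class Tracker:
    """Greedy nearest-neighbour ID assignment over world-frame detections."""

    def __init__(self, max_dist=0.5):    # metres; tuning assumption
        self.tracks = {}                 # id -> (x, y) last known position
        self.next_id = 0
        self.max_dist = max_dist

    def update(self, detections):
        """detections: list of (x, y) world positions; returns assigned IDs."""
        assigned = []
        unmatched = dict(self.tracks)    # tracks not yet claimed this frame
        for x, y in detections:
            best, best_d = None, self.max_dist
            for tid, (tx, ty) in unmatched.items():
                d = math.hypot(x - tx, y - ty)
                if d < best_d:
                    best, best_d = tid, d
            if best is None:             # no nearby track: genuinely new person
                best = self.next_id
                self.next_id += 1
            else:
                unmatched.pop(best)      # each track can be matched once per frame
            self.tracks[best] = (x, y)
            assigned.append(best)
        return assigned
```

For example, a person seen at (1.0, 0.0) by one Kinect and shortly afterwards at (1.2, 0.1) by the neighbouring one keeps the same ID, while a detection far from every track starts a new one. A real solution would need more (motion prediction, sensor calibration, occlusion handling), but this is the shape of the problem.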