Building flood emergency scenarios in augmented reality

Abstract

To find out whether AR can be used for building flood scenarios in an application that uses optical see-through glasses, the problem was first split into three sub-problems: user localisation, world creation and water level tracking.

User localisation allows the user to walk around in the scenario. It is achieved with a combination of marker-based tracking and IMU data: the user's position is calculated from the absolute position of a marker and the relative distance to it. Markers are the main source for the position estimate, but when no marker is visible the IMU becomes the more prominent source. Acceleration and velocity are combined to calculate the displacement over a time span, which is added to the last known position to obtain the new one. Because the different sources can indicate different user positions, the predictions are combined in a Particle Filter, which calculates the most likely position from the individual predictions and the standard deviations of those measurements. The final result is smoothed to remove outliers, yielding a steady estimate of the current position that allows the user to walk around.

To let the user build a scenario that can be flooded, a combination of buttons and hand gestures is used to place and modify objects in the scene. The user can select and place objects such as buildings, street lanterns and garbage bins to create a world. Placed objects can be modified in three ways: they can be scaled (made larger or smaller), rotated, and relocated. Furthermore, the user is able to save and load created worlds.

To simulate a flood scenario, two steps were required. Firstly, an invisible plane was added to the virtual world that hides points below it. Secondly, the water level has to be tracked in the camera feed. Blocks were painted below the marker and used to track the water level: an algorithm takes the marker's location in the frame together with data from the blocks, and calculates the water level from that information. The steps of the algorithm are grayscale conversion, rotation, cropping, blurring, segmentation and counting the segmented blocks, respectively. Illustrative sketches of several of these steps follow below.
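The IMU fallback described above amounts to dead reckoning: integrating acceleration over the elapsed time to update the last known position. The sketch below shows this under a constant-acceleration assumption per interval; the function and parameter names are ours, not taken from the project.

```python
import numpy as np

def dead_reckon(last_position, velocity, acceleration, dt):
    """Estimate the new user position from IMU data alone.

    last_position, velocity, acceleration: shape-(3,) arrays
    dt: elapsed time in seconds since the last known position
    """
    # Constant acceleration over dt: s = v*t + 0.5*a*t^2.
    displacement = velocity * dt + 0.5 * acceleration * dt ** 2
    new_velocity = velocity + acceleration * dt
    return last_position + displacement, new_velocity
```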
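The fusion of the marker and IMU predictions could look roughly like the following minimal particle filter, in which each source's standard deviation controls how strongly it pulls the estimate. This is a generic sketch of the technique, not the project's implementation; the motion-noise value and resampling scheme are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def particle_filter_step(particles, predictions, sigmas, motion_noise=0.01):
    """One update over (N, 3) position hypotheses.

    predictions: list of shape-(3,) position estimates (e.g. marker, IMU)
    sigmas: per-source standard deviations, in the same order
    """
    # Diffuse the particles slightly to model user motion between frames.
    particles = particles + rng.normal(0.0, motion_noise, particles.shape)

    # Weight every particle by its Gaussian likelihood under each source;
    # a source with a small standard deviation pulls the estimate harder.
    weights = np.ones(len(particles))
    for pred, sigma in zip(predictions, sigmas):
        sq_dist = np.sum((particles - pred) ** 2, axis=1)
        weights *= np.exp(-sq_dist / (2.0 * sigma ** 2))
    total = weights.sum()
    if total == 0.0:
        weights[:] = 1.0 / len(weights)  # sources disagree wildly: uniform
    else:
        weights /= total

    # The most likely position is the weighted mean; resampling keeps the
    # particle set from collapsing onto a few heavy particles.
    estimate = weights @ particles
    indices = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[indices], estimate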
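World creation amounts to maintaining a list of placed objects with editable transforms that can be serialised for the save/load feature. A hypothetical minimal representation, with invented field names and JSON chosen arbitrarily as the storage format, might be:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PlacedObject:
    kind: str            # e.g. "building", "street_lantern", "garbage_bin"
    position: tuple      # (x, y, z) in world coordinates; relocate = update this
    rotation_deg: float  # rotation around the vertical axis
    scale: float         # uniform scale factor (larger or smaller)

def save_world(objects, path):
    with open(path, "w") as f:
        json.dump([asdict(o) for o in objects], f)

def load_world(path):
    with open(path) as f:
        return [PlacedObject(**d) for d in json.load(f)]
```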
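The invisible flooding plane can be thought of as a clipping test: any point whose signed distance to the plane is negative is hidden. A small sketch of that test, in our own formulation:

```python
import numpy as np

def visible_points(points, plane_point, plane_normal):
    """Return only the points at or above the water plane.

    points: (N, 3) array; plane_normal points away from the water.
    """
    signed_distance = (points - plane_point) @ plane_normal
    return points[signed_distance >= 0.0]
```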
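Finally, the block-counting water level algorithm could be sketched with OpenCV as below. The abstract only fixes the order of the steps; the crop margins, block dimensions, Otsu thresholding and contour counting are our assumptions for illustration.

```python
import cv2
import numpy as np

def water_level(frame, marker_corners, n_blocks, block_height_m,
                block_px=30, half_width_px=20):
    """Estimate the water level from painted blocks below the marker.

    frame:          BGR camera image
    marker_corners: (4, 2) pixel corners of the detected marker
    n_blocks:       number of blocks painted below the marker
    block_height_m: real-world height of one block
    """
    marker_corners = np.asarray(marker_corners, dtype=float)

    # 1. Grayscale conversion.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # 2. Rotation: level the image using the marker's top edge.
    (x0, y0), (x1, y1) = marker_corners[0], marker_corners[1]
    angle = np.degrees(np.arctan2(y1 - y0, x1 - x0))
    h, w = gray.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    gray = cv2.warpAffine(gray, M, (w, h))

    # 3. Cropping: keep the strip directly below the marker.
    cx = int(marker_corners[:, 0].mean())
    top = int(marker_corners[:, 1].max())
    strip = gray[top:top + n_blocks * block_px,
                 cx - half_width_px:cx + half_width_px]

    # 4. Blurring, then 5. segmentation (Otsu threshold).
    strip = cv2.GaussianBlur(strip, (5, 5), 0)
    _, seg = cv2.threshold(strip, 0, 255,
                           cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # 6. Counting: blocks still visible above the water surface.
    contours, _ = cv2.findContours(seg, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    visible = min(len(contours), n_blocks)
    return (n_blocks - visible) * block_height_m
```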
Several directions remain for future work. Firstly, a different, more accurate algorithm for marker tracking could be implemented. Secondly, the algorithms used in the individual steps of the Particle Filter can be changed and improved. Finally, once a different marker tracking algorithm is in place, the water level tracking algorithm can be combined with the rest of the project, enabling real-time flood simulation. It is recommended to avoid the Meta SDK: despite the fact that it has a lot to offer, it is not suited for an application in this field.

Solving the three sub-problems answered the question of whether AR can be used for building flood scenarios with optical see-through glasses. The conclusion is that, in its current state, AR is not yet suited for this. The field of view on which objects can be displayed is too small to give a realistic visualisation of objects larger than a hand. In addition, calibrating the glasses properly for every user still has to be improved, since not all users experience the projected world as an addition to the real world. Consequently, the virtual world looks like a floating screen instead of an addition to the real world.