Our 2016 Projects under the Python Software Foundation:
Improving the Step Recognition Algorithm for V-ERAS
Virtual ERAS (V-ERAS) is a salient part of the European Mars Analog Station (ERAS) project of the Italian Mars Society (IMS).
The immersive VR simulation of V-ERAS lets users interact with a simulated Martian environment through the Aldebran VSS Motivity, an Oculus Rift and a Microsoft Kinect. Motivity is a passive omnidirectional treadmill, so users walk in place and their steps do not cover real distances. The setup therefore needs an accurate and robust algorithm for estimating users' steps, so that they can be reproduced in V-ERAS.
In the V-ERAS station simulation, the Step Recognition Algorithm runs on skeletal joint data from the Microsoft Kinect. The joints are recognized by means of the Skeletal Tracking feature of the Microsoft Kinect SDK (1.8).
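As a minimal sketch of how step recognition from skeletal joints can work, the snippet below counts a step each time a foot joint rises above a lift threshold and then returns to the floor. The threshold value and the reduction of a joint to a single height coordinate are illustrative assumptions, not the project's actual algorithm or parameters.

```python
def count_steps(foot_heights, lift_threshold=0.05):
    """Count steps from a sequence of foot-joint heights (metres above floor).

    A step is registered each time the foot rises above `lift_threshold`
    and then comes back down. Threshold and joint choice are illustrative,
    not the parameters used in V-ERAS.
    """
    steps = 0
    lifted = False
    for h in foot_heights:
        if not lifted and h > lift_threshold:
            lifted = True            # foot left the floor
        elif lifted and h <= lift_threshold:
            lifted = False           # foot came back down: one step
            steps += 1
    return steps
```

A real implementation would additionally smooth the noisy Kinect joint stream and estimate step length and direction, not just step count.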
However, the current setup has two main shortcomings, and the goal of my project is to address both:
- Enhance the accuracy of the current step recognition algorithm.
- Improve the recognition of the foot joints.
Implementation of an Interactive Heads-Up Display for Oculus Rift for the V-ERAS Simulation Environment
The project aims to develop an interactive heads-up display (HUD) on the Oculus Rift for the V-ERAS simulation. Once implemented in the existing V-ERAS simulation environment, the interactive semi-transparent interface should display data from the Habitat Monitoring Client. The Habitat Monitoring Client GUI is already in place and can interface with any generic Tango Device Server. Part of the project will focus on making the HUD highly interactive by recognizing touchless hand and body gestures through the Leap Motion Controller.
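A minimal sketch of how the HUD might turn habitat telemetry into display lines is shown below. The `read_attribute` callable stands in for the actual Tango client call, and the attribute names are illustrative assumptions, not the Habitat Monitoring Client's real interface.

```python
# Illustrative attribute names; the real habitat telemetry set may differ.
HUD_ATTRIBUTES = ["temperature", "pressure", "humidity"]

def format_hud_lines(read_attribute, attributes=HUD_ATTRIBUTES):
    """Return one HUD display line per telemetry attribute.

    `read_attribute` is a stand-in for the Tango client: any callable
    mapping an attribute name to its current value.
    """
    lines = []
    for name in attributes:
        value = read_attribute(name)
        lines.append("{}: {}".format(name, value))
    return lines

# Example with a fake telemetry source in place of a Tango device:
fake_telemetry = {"temperature": 21.5, "pressure": 101.3, "humidity": 40}.get
print(format_hud_lines(fake_telemetry))
```

Keeping the data source behind a plain callable makes the HUD logic testable without a running Tango server.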
Integration of a Unity game scene with the existing pyKinect module to emulate a moving skeleton based on the movements tracked by the Kinect sensors
The aim of this project is to replace the existing Blender-based simulations with the Unity game engine.
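One way to bridge the Python-side Kinect tracking and a Unity scene is to serialize the tracked joints into a JSON message that Unity deserializes to drive the skeleton. The sketch below shows the Python side under that assumption; the message layout and joint names are illustrative, not the project's actual wire format.

```python
import json

def skeleton_to_json(joints):
    """Serialize tracked joints as a JSON message for the Unity scene.

    `joints` maps joint names to (x, y, z) coordinate tuples. The
    "type"/"joints" layout is an illustrative assumption about the
    message format, not the project's defined protocol.
    """
    payload = {
        "type": "skeleton",
        "joints": {name: {"x": x, "y": y, "z": z}
                   for name, (x, y, z) in joints.items()},
    }
    return json.dumps(payload, sort_keys=True)

# Example: a single tracked head joint.
print(skeleton_to_json({"head": (0.0, 1.7, 0.2)}))
```

Each frame's message could then be sent to Unity over a local socket, with a small C# listener applying the coordinates to the rigged skeleton.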