Full body and hand gesture tracking

Abstract

Integration of whole-body motion and hand-gesture tracking of astronauts into the ERAS (European MaRs Analogue Station for Advanced Technologies Integration) virtual station. Skeleton-tracking-based feature extraction methods will be used to track whole-body movements and hand gestures, which will have a visible representation in terms of the astronaut's avatar moving in the virtual ERAS station environment.

Benefits to ERAS

“By failing to prepare, you are preparing to fail.” ― Benjamin Franklin

It will help astronauts become familiar with their habitat/station, the procedures to enter and leave it, the communication with other astronauts and rovers, and so on. Preparing themselves with beforehand training in the virtual environment will boost their confidence and greatly reduce the chances of failure, ultimately increasing the success rate of the mission.

Project Details

INTRODUCTION

The idea of integrating full-body and hand-gesture tracking is proposed after a thorough discussion with the ERAS community. The proposed method uses a 3D skeleton tracking technique based on a depth camera, the Kinect sensor (Kinect for Xbox 360 in this case), to capture approximate human poses that are then reconstructed and displayed as a 3D skeleton in the virtual scene using OpenNI, the NITE PrimeSense middleware and the Blender game engine. The technique will detect bone-joint movements in real time with correct position tracking and display a 3D skeleton in a virtual environment with the ability to control the movements of a 3D character. The idea here is to dig deeper into skeleton tracking features to track whole-body movements and capture hand gestures. The software should also maintain long-term robustness and tracker quality. It is also important that the code be less complex and more efficient, with more automated behaviour and minimal or no boilerplate code, and that it follow the standard coding style set by the IMS (Italian Mars Society) coding guidelines.

The other important feature of the tracker software is that it should be sustainable long-term in order to support future improvements. In other words, the code and tests must be easy to modify when the core tracker code changes, to minimize the time needed to fix them after architectural changes to the tracker software. This would allow developers to be more confident when refactoring the software itself. The details of the project and the proposed plan of action follow.

REQUIREMENTS DURING DEVELOPMENT

Hardware Requirements 
  • Kinect sensor (Kinect for Xbox 360)
  • A modern PC/Laptop
Software Requirements 
  • OpenNI/NITE library
  • Blender game engine
  • Tango server
  • Python 2.7.x
  • Python Unit-testing framework
  • Coverage
  • Pep8
  • Pyflakes
  • Vim (IDE)

THE OUTLINE OF WORK PLAN

Skeleton tracking will be done using the Kinect sensor and the OpenNI/NITE framework. The Kinect sensor generates a depth map in real time, where each pixel corresponds to an estimate of the distance between the Kinect sensor and the closest object in the scene at that pixel's location. Based on this map, an application will be developed to accurately track different parts of the human body in three dimensions.
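To make the data flow concrete, here is a minimal sketch of reading one depth frame in Python. It assumes the PyOpenNI bindings on top of OpenNI; the exact class and method names may differ between versions, so treat it as an illustration rather than final code.

    # Minimal sketch: read one Kinect depth frame through OpenNI.
    # Assumes the PyOpenNI bindings; names may vary between versions.
    from openni import Context, DepthGenerator

    ctx = Context()
    ctx.init()

    depth = DepthGenerator()
    depth.create(ctx)
    ctx.start_generating_all()

    # Each depth value is an estimate (in mm) of the distance between the
    # sensor and the closest object at that pixel.
    ctx.wait_one_update_all(depth)
    depth_map = depth.map                    # flat 640x480 array by default
    centre = depth_map[320 + 240 * 640]      # distance at the image centre
    print("Distance at image centre: %d mm" % centre)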

OpenNI allows applications to be used independently of the specific middleware, and therefore lets further development code interface directly with OpenNI while using the functionality of the NITE PrimeSense middleware. The main purpose of the NITE PrimeSense middleware is image processing, which provides both hand-point tracking and skeleton tracking. The whole skeleton can be tracked with this technique; however, the main focus of the project will be on developing a framework for full-body motion and hand-gesture tracking that can later be integrated with the ERAS virtual station. The following flow chart gives a pictorial view of the working steps.

[Figure: SkeletonTracking1.png (skeleton tracking workflow)]

The whole work is divided into three phases:

  • Phase I   : Skeleton tracking
  • Phase II  : Integrating the tracker with the Tango server and prototype development of a glue object
  • Phase III : Displaying the 3D skeleton in the 3D virtual scene

Phase I : Skeleton Tracking
This phase covers tracking of full-body movements and hand-gesture capture. RGB and depth stream data are taken from the Kinect sensor and passed to the PSDK (Prime Sensor Development Kit) for skeleton calibration.

Skeleton Calibration : Calibration is performed so that the tracker locks onto a specific user and learns the user's body measurements before tracking begins.

Skeleton calibration can be done :

  • Manually, or
  • Automatically

Manual Calibration :

For manual calibration the user is required to stand in front of the Kinect with the whole body visible, holding both arms raised (the 'psi' pose) for a few seconds. This process may take 10 seconds or more depending on the position of the Kinect sensor.

Automatic Calibration :

It enables NITE to start tracking a user without requiring a calibration pose, and a skeleton is created shortly after the user enters the scene. Although the skeleton appears immediately, auto-calibration takes several seconds to settle on accurate measurements. Initially the skeleton may be noisy and less accurate, but once auto-calibration determines stable measurements the skeleton output becomes smooth and accurate.

After analyzing the drawbacks and limitations of both methods, the proposed application will give the user the option to choose between the two calibration methods. Considering that a user goes out of view only if the training session is interrupted, we will ask the user (who will always occupy the same VR station) to do a manual calibration at the beginning of the week; an automatic recalibration can then happen every time a simulation restarts within the same training rotation.

Skeleton Tracking : Once calibration is done, OpenNI/NITE starts the algorithm for tracking the user's skeleton. If the person goes out of the frame but comes back quickly, tracking continues. However, if the person stays out of the frame for too long, the Kinect recognizes that person as a new user once she/he comes back, and calibration needs to be done again. One advantage here is that the Kinect does not need to see the whole body if tracking is configured for the upper body only.
Output : the NITE APIs return the positions and orientations of the skeleton joints.
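As an illustration of this phase, the sketch below requests calibration and reads joint positions in Python. It assumes the PyOpenNI bindings for the OpenNI/NITE skeleton capability; constant and method names may differ between versions.

    # Sketch: calibrate users and read joint positions via OpenNI/NITE.
    # Assumes the PyOpenNI bindings; names may vary between versions.
    from openni import (Context, UserGenerator, SKEL_PROFILE_ALL,
                        SKEL_HEAD, SKEL_RIGHT_HAND)

    ctx = Context()
    ctx.init()

    user = UserGenerator()
    user.create(ctx)
    skel = user.skeleton_cap
    skel.set_profile(SKEL_PROFILE_ALL)      # track the full skeleton
    ctx.start_generating_all()

    while True:
        ctx.wait_and_update_all()
        for uid in user.users:              # every user NITE currently sees
            if skel.is_tracking(uid):
                head = skel.get_joint_position(uid, SKEL_HEAD)
                hand = skel.get_joint_position(uid, SKEL_RIGHT_HAND)
                # head.point is (x, y, z) in the sensor reference frame;
                # head.confidence reports how reliable the estimate is.
            elif skel.is_calibrated(uid):
                skel.start_tracking(uid)
            elif not skel.is_calibrating(uid):
                # True = auto-calibrate without waiting for the 'psi' pose
                skel.request_calibration(uid, True)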

Phase II : Integrating tracker with Tango server and prototype development of a glue object
Everything developed here must be ready to support multiple users (a crew of 4-6 astronauts), so there will be 4-6 Kinect sensors and 4-6 computers, each supporting one virtual station. The application must be able to populate each astronaut's environment with the avatars of all crew members. The idea is that Tango will distribute the skeleton data of all crew members for cross visualization. Skeleton data obtained from each tracker instance will be published to the Tango server as Tango parameters. A prototype will be developed to change the reference frame of the tracked data from the NITE frame to the Blender frame and send it to the Blender framework for further processing. It is called a glue object since it acts as an interface between the NITE and Blender frameworks. Since Blender has Python bindings, this glue object will be written in Python.

[Figure: GlueObject1.png (glue object overview)]
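A first sketch of the glue object is shown below: it converts NITE joint positions to a Blender-style reference frame and pushes them to a Tango device with PyTango. The device name "c3/eras/tracker0", the attribute name "skeleton" and the axis mapping are placeholders and assumptions, not the final interface.

    # Sketch of the glue object: NITE frame -> Blender frame -> Tango.
    # Device and attribute names below are placeholders, not final.
    import PyTango

    def nite_to_blender(point, scale=0.001):
        # NITE reports positions in millimetres in the camera frame;
        # here we assume Blender wants metres with Z pointing up.
        x, y, z = point
        return (x * scale, z * scale, y * scale)

    class SkeletonPublisher(object):
        def __init__(self, device_name="c3/eras/tracker0"):
            self.proxy = PyTango.DeviceProxy(device_name)

        def publish(self, joints):
            # joints: dict mapping joint name -> (x, y, z) in the NITE frame
            flat = []
            for name in sorted(joints):
                flat.extend(nite_to_blender(joints[name]))
            # publish the flattened positions as a spectrum attribute
            self.proxy.write_attribute("skeleton", flat)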

Phase III : Displaying 3D Skeleton in 3D virtual scene
In this phase all skeleton data is taken from the glue object and transferred to the Blender framework, where the 3D skeleton is displayed in the 3D virtual scene driven by the Blender game engine. Basically it provides a simulation of the user in the virtual environment: the 3D skeleton inside the virtual environment will mimic the same gestures and behaviour the user performs in the real world.
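For illustration, a possible way to drive the avatar inside the Blender game engine is sketched below, assuming the bge Python API and an armature whose channel (bone) names match the incoming joint names; the actual rig and data format will be defined during the project.

    # Sketch: apply tracked joint rotations to the avatar's armature.
    # Runs as a controller script attached to the armature object in BGE.
    from bge import logic
    import mathutils

    def apply_skeleton(rotations):
        # rotations: dict mapping bone name -> quaternion (w, x, y, z)
        # as received from the glue object (the format is an assumption).
        armature = logic.getCurrentController().owner
        for channel in armature.channels:
            quat = rotations.get(channel.name)
            if quat is not None:
                channel.rotation_mode = logic.ROT_MODE_QUAT
                channel.rotation_quaternion = mathutils.Quaternion(quat)
        armature.update()    # re-evaluate the armature with the new pose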

Deliverables

An application that tracks full-body movement and hand gestures for effective control of the astronaut's avatar, with the following features:

  • The application detects movement and displays the user's skeleton in the 3D virtual environment in real time, with joint positions presented accurately
  • It can detect multiple users' movements simultaneously
  • Bones and joints can be displayed in the 3D model in different colours, with the user's name shown above the head joint
  • It can display the RGB and depth video streams during user movement
  • Users can interact with the 3D virtual scene through rotation and zoom functions and can view the avatar from a variety of perspectives
  • It can display the 3D virtual environment in a variety of formats (3DS and OBJ); the virtual environment can also be adjusted without interrupting motion tracking
  • Proper automated test support for the application, with automated unit tests for each module
  • Proper documentation of the work for developers and users

For a more detailed description of the application, check out this link: https://wiki.mozilla.org/Abhishek/IMS_Gsoc2014Proposal


Selected For GSoC (Google Summer Of Code) 2014

I guess I am quite late in posting this good news, as I was busy with my academics and final-year project. I have been selected for the GSoC (Google Summer Of Code) 2014 program as a student intern. Google Summer of Code is a global program that offers students stipends to write code for open source projects. Google has worked with the open source community to identify and fund exciting projects for the summer.

This summer I will be working with the IMS (Italian Mars Society) on their V-ERAS (Virtual European MaRs Analogue Station for Advanced Technologies Integration) project. My work involves integrating whole-body motion and hand-gesture tracking of astronauts into the ERAS (European MaRs Analogue Station for Advanced Technologies Integration) virtual station. Skeleton-tracking-based feature extraction methods will be used to track whole-body movements and hand gestures, which will have a visible representation in terms of the astronaut's avatar moving in the virtual ERAS station environment.

To know more about Google Summer of Code, check this link: http://www.google-melange.com/gsoc/homepage/google/gsoc2014

References:

[1] https://www.google-melange.com/gsoc/project/details/google/gsoc2014/abhisheksingh/5733935958982656

[2] https://wiki.mozilla.org/Abhishek/IMS_Gsoc2014Proposal

Implement a battery of unit tests for SSSD

Introduction

The idea of implementing a battery of unit tests for SSSD (System Security Services Daemon) using the cmocka unit test framework is proposed after a thorough discussion with the SSSD upstream maintainer jhrozek (Jakub Hrozek, #sssd). It is not just about writing better automated test code but about a complete refinement of the SSSD unit tests using the cmocka unit testing framework, in a way that reduces the complexity of the unit testing code, makes it efficient, and provides a good mocking framework so other developers can test more thoroughly. The details of the project and the proposed plan of action follow.

Abstract

Implementing unit tests for SSSD modules using the cmocka unit testing framework, with proper refactoring, minimal boilerplate and better test coverage. The tests would cover new features, but would focus mostly on the core SSSD features, giving developers better confidence when writing new code.

Benefits to Fedora community

  • Contributing the set of unit tests to SSSD would greatly improve its long-term stability and would help raise confidence when pushing new SSSD versions into Fedora or other distributions.
  • Making SSSD tests less complicated and using a mock-based unit testing framework would certainly result in an improved testing mechanism and better error handling in SSSD.
  • Improvement in test coverage will result in improved code quality of SSSD.
  • Writing unit tests will build deeper confidence in the correct behaviour of the SSSD code and eventually make many SSSD-related issues easier to resolve.

Project Details

The aim of the project is not just quality assurance of SSSD but to provide a proper implementation of a unit testing framework rather than just a proof of concept; it has far greater goals. SSSD is an important part of the authentication picture for Fedora and other Linux distributions. Unfortunately the current version of SSSD lacks a proper unit testing framework for exercising the code paths that are only reachable when SSSD is connected to the network. This project is mostly about writing new tests based on the cmocka framework and completely refining the old SSSD tests written with the check framework. The idea here is to dig deeper into testing to provide and maintain the long-term robustness and quality of SSSD. It is also important that the new cmocka-based tests be less complex and more efficient, with more automated behaviour and minimal or no boilerplate code, and that they follow the coding style set by the SSSD coding guidelines.

The other important feature of the framework should be that it should be sustainable long-term in order to support further SSSD improvements. In other words, the tests must be easy to modify when the core SSSD code changes to minimize the time needed to fix the unit tests after architectural changes are performed to the SSSD. This feature would allow the SSSD developers to be more confident of refactoring changes in the daemon itself.

Tools Required During Development

  • the Talloc and Tevent libraries
  • Cmocka library
  • Coverage tool : lcov
  • Vim (IDE)


The outline of my work plan

The initial stage of my work deals with becoming familiar with SSSD and learning the concepts of the cmocka unit-testing framework, as mentioned in the plan below.

The general idea for the unit tests is to cover the two most important parts:

  • retrieving user information
  • authenticating users.

The following diagram gives a pictorial representation of the core components of SSSD and how they interact.

[Figure: Sssdsoc.png (core SSSD components)]
Basically the whole project is divided into two phases, which mimic how SSSD itself is structured:

  • Phase I : building provider tests
  • Phase II: building responder tests

Because of the large size of the SSSD project, the unit testing framework would focus on the core SSSD features that are enabled in most, if not all, SSSD deployments. In particular, the unit tests would cover only the NSS and PAM responders, while the back-end (provider) tests would cover the LDAP user and group code.

Time-line for Milestones

The project is planned to be split across the following weekly phases:

[Project Week 1]

Learning the tevent library and the asynchronous model

[Project Week 2]

Learning the tevent library and the async model. Might include some experimenting and reading the code.

[Project Week 3]

Reading the current NSS responder test and augmenting the “get user by name/ID tests”

[Project Week 4]

Adding a similar test for retrieving groups as added for users in the previous step.

[Project Week 5]

Adding another test for the initgroups operation.

[Project Week 6]

Studying the PAM responder code.

[Project Week 7]

Adding a unit test that would cover the PAM responder. Only login (PAM authentication phase) can be covered.

[Project Week 8]

Learning the backend and provider internals. The current DNS update tests might be a good start.

[Project Week 9]

Creating unit tests for retrieving LDAP users. These tests would not be big by themselves, but would include code to be reused by other LDAP tests later

[Project Week 10]

Creating unit tests for storing LDAP groups without nesting (RFC2307)

[Project Week 11]

Creating unit tests for storing LDAP groups with nesting (RFC2307bis)

[Project Week 12]

An extra week to polish the work before submission

Deliverables

Better and improved test code for SSSD, with the following features:

  • Tests covering NSS and PAM responders
  • Contribute to the overall code quality by uncovering issues with the unit tests
  • Less complex test infrastructure
  • More efficient testing mechanism

Unit Tests For Mozbase

Introduction

The idea of unit testing mozbase is proposed after a good discussion with jhammel, jmaher and ctalbert (members of the Mozilla A-Team). It is not just about writing better automated test code but about a complete refinement of the mozbase unit tests, including the unit testing framework, in a way that reduces code complexity and makes the tests efficient. The details of the project and the proposed plan of action follow.

Abstract

Implementing unit tests for mozbase modules using the unittest framework, with proper refactoring, minimal boilerplate and better test coverage, plus sanity checking, resulting in efficient test code.

Benefits to Mozilla community

  • Implementing mozbase tests with less complicated tests and a unittest-based framework would certainly result in an improved testing mechanism and better error handling in mozbase.
  • Improvement in test coverage will result in improved code quality of mozbase.
  • Detailed documentation of the work will help contributors make further improvements.
  • Writing unit tests will give deeper insight into the behaviour of the mozbase code and eventually help resolve many of the mozbase-testing issues on the bug tracker.

Project Details

The aim of the project is not just quality assurance of mozbase but to provide a proper implementation of a unit testing framework rather than just a POC (proof of concept); it has far greater goals. Mozbase is so important that all of Mozilla's test harnesses use it to some degree, including Talos, mochitest, reftest, Autophone and Eideticker. But the current version of mozbase lacks proper unit tests and has poor test coverage. This project is mostly about writing new, efficient tests and completely refining the old mozbase tests using unit testing. The idea here is to dig deeper into testing to provide and maintain the robustness and quality of mozbase. It is also important that the new tests be less complex and more efficient, with more automated behaviour and minimal or no boilerplate. They should also pass pep8 and pyflakes checks.
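As a small illustration of the intended style (plain unittest, minimal boilerplate, one behaviour per test), here is a hedged sketch using mozinfo and moznetwork; the attributes checked are assumptions about their public APIs, and the real tests would cover each sub-module much more thoroughly.

    # Sketch of the intended test style for mozbase modules.
    # The attributes used here are assumptions about the public APIs.
    import re
    import unittest

    import mozinfo
    import moznetwork


    class TestMozinfo(unittest.TestCase):
        def test_os_is_reported(self):
            # mozinfo exposes the detected platform as a short string
            self.assertTrue(isinstance(mozinfo.os, str) and mozinfo.os)

        def test_bits(self):
            self.assertIn(mozinfo.bits, (32, 64))


    class TestMoznetwork(unittest.TestCase):
        def test_get_ip_looks_like_ipv4(self):
            ip = moznetwork.get_ip()
            self.assertTrue(re.match(r"^\d{1,3}(\.\d{1,3}){3}$", ip))


    if __name__ == "__main__":
        unittest.main()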

Another required feature of the framework is that it should be sustainable and support further improvements. That is, if the tests need to be extended because a feature is added or enhanced, or need to be modified a year or more later, only a slight change to the fixers should be enough, and an automated process should take care of applying the respective fixers to the tests. Since it would be quite troublesome to rewrite all the tests with each modification or new release of the mozbase dependencies, this feature is quite necessary from a development perspective.

Tools Required During Development

  • Python Unittesting framework
  • Coverage
  • Pep8
  • Pyflakes
  • Vim (IDE)

The outline of my work plans

The mozbase modules that will be tested are:

  • moznetwork
  • mozprocess
  • mozcrash
  • mozdevice
  • mozfile
  • mozhttpd
  • mozinfo
  • mozinstall
  • mozlog
  • manifestdestiny
  • mozb2g
  • mozprofile
  • mozrunner

These modules further consist of different sub-modules that will be unit tested.

Schedule of Deliverables

The project is planned to be split across the following weekly phases:

[Test Week 1] Adding unit tests for mozinfo and moznetwork.

[Test Week 2] Writing tests for mozfile and mozhttpd

[Test Week 3] Tests covering sub-modules of mozprofile.

[Test Week 4] Continuation of tests for sub-modules of mozprofile

[Test Week 5] Creating tests for sub-modules of mozprocess

[Test Week 6] Continuation of tests for sub-modules of mozprocess

[Test Week 7] Adding tests for mozb2g, mozcrash and mozinstall.

[Test Week 8] Writing tests for manifestdestiny.

[Test Week 9] Test for mozrunner sub-modules.

[Test Week 10] Adding tests for mozdevice sub-modules.

[Test Week 11] Continuation week for mozdevice

[Test Week 12] Final week for mozdevice and to polish the work before submission

Deliverables

  • Tests covering mozdevice, mozprocess and other core modules of mozbase.
  • Contribute to the overall code quality by uncovering issues with the unit tests
  • Less complex test infrastructure
  • More efficient testing mechanism