Saturday, December 24, 2011

Candescent NUI 11936

I've created a new release of Candescent NUI. It can be downloaded here: [source] or [binary]

Here's a list of the changes.

HandDataSource Split
I've moved part of the code that was in CCT.NUI.HandTracking to the Core library. There's a new namespace, CCT.NUI.Core.Shape, that contains a new kind of data source: IShapeDataSource.

I've moved the convex hull and contour detection to this new data source. The process is now:
1. Clustering: Creates clusters
2. Shape processing: Creates shapes that have a convex hull and a contour
3. Hand detection: Uses shapes to detect hand data (like finger tips and center of palm)
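
Here's a minimal sketch of how the three stages could be wired together. Only IShapeDataSource is named above, so the concrete shape class and its constructor are assumptions on my part:

var factory = new OpenNIDataSourceFactory("config.xml");
var clusterDataSource = factory.CreateClusterDataSource();    // 1. clustering
var shapeDataSource = new ShapeDataSource(clusterDataSource); // 2. shape processing (assumed name)
var handDataSource = new HandDataSource(shapeDataSource);     // 3. hand detection
handDataSource.Start();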

There's a new class in town: HandDataFactory
If you want to control the setup and handling of OpenNI or the Kinect SDK yourself, instead of using the DataSource classes as was required until now, you can create a HandDataFactory. It offers just two methods: one creates hand data from a pointer to OpenNI depth data, the other takes an ImageFrame returned by the Kinect SDK.

var factory = new HandDataFactory();

//OpenNI
var handData = factory.Create(depthGenerator.DataPtr);

//Kinect SDK
var handData = factory.Create(imageFrame);

Other changes
- Contour points now also have a Z value
- Performance should be a bit better
- Internal refactoring

Friday, December 23, 2011

OpenNI 1.5.2.7 (unstable)

The guys at OpenNI are keeping me busy :-)

I've upgraded Candescent NUI to the newest OpenNI release 1.5.2.7 (unstable)

Downloadable here [binary] or [source]


Tuesday, December 13, 2011

OpenNI 1.4.0.2

I've tried to upgrade to the new unstable OpenNI version (1.4.0.2). The installation was okay (had to restart because OpenNI didn't find an environment variable, but then it worked).

Everything seemed to run when I started the sample project from Visual Studio. But when I executed the compiled CCT.NUI.Samples.exe I only got an exception (a NullReferenceException, because the pointer to the depth data was IntPtr.Zero). When I caught the exception the result was only noise:


I don't know why this happened, but I guessed it might have something to do with the fact that I've got a 64-bit machine but installed the 32-bit version (this used to work with the stable version when I set the build output to x86).

After installing the 64-bit version everything seems to work now. I've committed this version to CodePlex. The published binary will now also be compiled for 64-bit.

To upgrade OpenNI follow the instructions given here: Avin KinectSensor

Download [Source] or [Binary]

Wednesday, December 7, 2011

Kinect Hacks - End of the hype?

Almost exactly one year after the first 'Kinect Hacks' surfaced, the hype seems to be over. I've visited several sites and noticed that there are very few new things being posted.

Is it because nobody does cool stuff with the Kinect anymore? Or are there too few visitors to those sites to be worth maintaining?

I've made a list:

http://kinect-hacks.net
Last post: June 11
Second last post: June 2
Total hacks this week: 0
Inactive

http://www.kinect-hacks.co.uk/tag/kinecthacks-net-news
Last post: Sept 2
Second last post: Aug 29
Total hacks this week: 0
Inactive (this site was even hacked and nobody seems to care; visit the front page at your own risk!)

http://www.kinecthacks.nl
Last post: Nov 24
Second last post: April 19
Total hacks this week: 0
Inactive

http://kinect.dashhacks.com
Last post: Dec 2
Second last post: Nov 27
Total hacks this week: 1
Almost inactive
-> The former admin created a new site here (active): http://developkinect.com

http://kinecthacks.net
Last post: Dec 5
Second last post: Dec 5
Total hacks this week: 2
Somewhat active

http://www.kinecthacks.com
Last post: today
Second last post: Dec 5
Total hacks this week: 5
Active

Friday, December 2, 2011

Candescent NUI 11366

I've uploaded the new version of Candescent NUI to CodePlex (Binary and Source)

Changes
  • Finger Direction Detection (see details below)
  • Refactoring, improved Code Coverage

Finger Direction Detection
With this version it's possible to detect the direction the fingers point. Each FingerPoint has a DirectionVector (normalized to a length of 1) that indicates the direction the finger is pointing. That means you can define the following function, where p is the tip of the finger, v is the direction vector and a is a scalar value: f(a) = v*a + p
This results in the orange line (I've limited the length).
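
As a small illustration, here's how a point on that line could be computed (a sketch: DirectionVector is the real property, while the finger tip property name and the Point constructor are my assumptions):

// Computes f(a) = v*a + p for a finger point.
Point PointOnDirectionLine(FingerPoint finger, float a)
{
    var p = finger.Location;        // finger tip (assumed property name)
    var v = finger.DirectionVector; // normalized to length 1
    return new Point(p.X + a * v.X, p.Y + a * v.Y, p.Z);
}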

The orange dots are the base points. They're found by traversing a given quantity of points on the contour in both directions. The number of contour points to skip depends on the z coordinate of the finger tip (the closer the hand, the more points the contour contains).
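
A sketch of that dependency (the step function and both constants are invented for illustration, not the actual implementation):

// More contour points when the hand is close, so more points are skipped.
int ContourPointsToSkip(float fingerTipZ)
{
    const float referenceZ = 1000f;  // invented reference distance in mm
    const int stepsAtReference = 20; // invented base step count
    return (int)(stepsAtReference * referenceZ / fingerTipZ);
}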

There is a new settings option that toggles whether the finger direction is detected.




Saturday, November 26, 2011

Video Manipulation Interface

I was heavily impressed by this demonstration of a Minority Report inspired image manipulation interface by Garratt Gallagher (for those who haven't seen the movie, here's the scene that shows the interface).


Today I'm trying to take it to the next level. Here's my version of such an interface that focuses on videos. It lets you place (predefined) videos in 3D space on top of the RGB image captured by the Kinect. A simple gesture lets you time-shift the video and another closes the video.


Written in C# using the Kinect, OpenNI and WPF 4 (this is the very first version and there's a lot to improve...)


These are the input gestures that are currently working:

  • Defining a video space with 4 fingers (2 on each hand) starts a new video inside that space
  • Opening the right hand while pointing at a video with one finger of the left hand closes the video
  • Opening the left hand puts the video into time-shift mode. Moving the right finger on the x-axis forwards or rewinds the video

The source code for the video manipulation is not online yet. The hand and finger detection can be downloaded from CodePlex: http://candescentnui.codeplex.com

Friday, November 25, 2011

Improving Finger Detection

The next version of Candescent NUI will find the direction the fingers are pointing (2D for the moment). I've just started working on this but the first tests look promising:

I also want to make the code more modular, so it will be possible to "plug in" different algorithms for the various steps without needing to work directly with the source code.

Tuesday, November 22, 2011

New Kinect Hardware

Microsoft has announced that there will be new Kinect hardware (including new firmware) next year when they launch the commercial platform. It's not clear whether the firmware of the current devices can be upgraded.


The most important feature for me will be the "Near Mode" that is optimized down to 50cm distance and should even work (though less well) at 40cm. Quoting Craig Eisler:
"This is one of the most requested features from the many developers and companies participating in our Kinect for Windows pilot program and folks commenting on our forums, and we’re pleased to deliver this, and more, at launch."
Thanks to those who supported my thread in the Kinect forums!

Monday, November 21, 2011

Comment Comments

What do you do if the coding guidelines that apply to the project you're working on require that

- "around 25% - 50% of code should be comments"
- "every source file must have a history section at the top"
- "every method must have java doc or summary tags"
- "the ending bracket of a method must be commented with the method name"

Do you do it because you are required to? Or do you ignore that guideline knowing that you might get into trouble?
Luckily we were able to convince the right people that this makes no sense and that it's the guideline that needs to change. But unfortunately a bit too late: Today I still come across gems like the one below:

/// <summary>
/// calc rectangle with offsets
/// </summary>
/// <param name="index"></param>
/// <param name="offsetX"></param>
/// <param name="offsetY"></param>
/// <param name="left"></param>
/// <param name="top"></param>
/// <param name="width"></param>
/// <param name="height"></param>
/// <returns></returns>
private Rectangle CalculateRectangle(int index, int offsetX, int offsetY, int left, int top, int width, int height)
{
  [...]
} //CalculateRectangle

Wednesday, November 16, 2011

Enterprise Infrastructure Misconfiguration

I'm currently working as an external employee in a project for a government agency that also provides the hardware I have to work with.

This Monday I timed how long it takes from the moment I press the power button on my laptop (running Windows 7) until I'm logged in, Outlook and Visual Studio are ready, and I can start doing some real work.

The result was a shocking 16 minutes (down to 10 minutes today, but still...)

Let's do some math: 220 workdays a year * 16 minutes = 3,520 minutes, or almost 59 hours of wait time a year, nearly a week and a half of lost work time.

Why...?

Friday, November 11, 2011

Candescent NUI 10943

I've uploaded the new version of Candescent NUI to CodePlex (Binary and Source)

Changes
  • Upgrade OpenNI from 1.3.2.3 to 1.3.3.6
  • Upgrade Kinect SDK from 1.0.0.11 to Beta 2
  • A lot of internal refactoring (but also including breaking changes on the interface)
  • Some namespaces have changed
  • Added (more) tests
  • Source code for the Test Data Collector WPF application
  • StartMenu must be configured to use either OpenNI or Kinect SDK (see app.config)
If something stopped working for you, please write to info@candescent.ch. I'll try to fix it over the weekend.


Thursday, November 10, 2011

#region is evil

Today at work I came across this piece of code. Let's focus on the #region part.

public void Validate(Guid instanceId, Guid ruleSetId)
{
    [code here]

    #region logging
    if (logger.IsInfoEnabled)
        logger.InfoFormat("Excecuting validation for instance '{0}' with ruleset '{1}'", instanceId, ruleSetId);

    if (logger.IsDebugEnabled)
        logger.DebugFormat("Validation script for instance '{0}'':\r\n{1}", instanceId, script);
    #endregion

    [more code here]
}

When collapsed it looks like this:
    [code here]
+ "logging"
    [more code here]

We could improve it a bit by using a better region name:
    [code here]
+ "Log Validation Execution"
    [more code here]

But still: When the region is collapsed we don't see what's happening. It might be anything, because the code has access to all method and instance scoped variables. If the region is expanded we get lost in details.

What I would do:
  1. Move all the logging code to a separate class (say ValidationLogger)
  2. Replace the #region with a single method call:
    [code here]
    log.LogValidationExecution(instanceId, ruleSetId);
    [more code here]

The advantage is that you see what happens (a method call) and what information is used for logging (the passed parameters), and it's just one line.
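
A sketch of what the extracted class could look like (the name ValidationLogger comes from step 1 above; the rest is one possible shape, assuming a log4net-style ILog):

public class ValidationLogger
{
    private readonly ILog logger;

    public ValidationLogger(ILog logger)
    {
        this.logger = logger;
    }

    public void LogValidationExecution(Guid instanceId, Guid ruleSetId)
    {
        if (this.logger.IsInfoEnabled)
            this.logger.InfoFormat("Executing validation for instance '{0}' with ruleset '{1}'", instanceId, ruleSetId);
    }
}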

Please, think twice before using the #region tag. In many cases it's a strong indicator that you need to refactor your code!


Monday, November 7, 2011

Testing the Code

I've run some code coverage analysis and here's the result. The basic test that loads a saved frame and then runs the hand detection has the following coverage: 

Code Coverage in Version 9489

CCT.NUI.Core          38%
CCT.NUI.HandTracking  68%

The next version will have at least the following coverage

CCT.NUI.Core          71%
CCT.NUI.HandTracking  72%

I reorganized some namespaces and also changed the public interface, so the new version will require users to update their code.

I've also migrated to the newest Kinect SDK Beta 2. But because it still does not return depth values < 800mm it's not really an alternative to OpenNI.

I hope to be able to release the new version over the next weekend!

Edit (09 Nov 2011):
Let's see how high I can get it. Current values:
CCT.NUI.Core          80%
CCT.NUI.HandTracking  72%

I won't reach 100% coverage in the core library because I can't test OpenNI and Kinect at the same time. Maybe I'll put the interface code into separate assemblies (like CCT.NUI.OpenNI and CCT.NUI.SDK). This would also have the advantage that only the required dependencies are referenced and the others can be ignored.

Saturday, November 5, 2011

New Test Data File Format

In the first version of the Test Data Collector I've used .NET binary serialization. This has turned out to be too inflexible.

v2 of the TestDataCollector saves the data as XML. The depth frame is encoded as a base64 byte array inside the XML. This results in a larger file size, but the hand data is human-readable and no type information is saved within the file.
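
The encoding itself boils down to something like this (a sketch of the idea, not the actual file writer; I'm assuming the depth frame is held as an array of ushort values):

// Pack the depth values into bytes and encode them as base64 for the XML element.
ushort[] depthValues = GetDepthFrame(); // hypothetical source of the frame
byte[] bytes = new byte[depthValues.Length * sizeof(ushort)];
Buffer.BlockCopy(depthValues, 0, bytes, 0, bytes.Length);
string base64 = Convert.ToBase64String(bytes);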

v2 can still open the old files. They can then be saved in the new format.

I've updated the download on CodePlex.

Warning: v3 will not be able to read the old binary tfrm files anymore; you have to use v2 to convert them!

Friday, November 4, 2011

Kinect SDK Beta 2

Microsoft has released the new version of the Kinect SDK, Beta 2:

http://www.microsoft.com/download/en/details.aspx?id=27876

The changelog says nothing about the minimum depth threshold, is it still 850mm?

Edit:
Eddy Escardo-Raffo wrote in the KinectSDK forum on October 24, 2011 that they have prioritized the request to make the minimum depth value configurable. The forum post is here.

So maybe it'll be possible in the next version.

Edit 2:
I've tested the Beta 2 on my Laptop and it did not return any depth value below 800mm.


PS: Here's an interview with Rob Relyea about the new Beta. The interviewer might actually have mentioned Candescent NUI on CodePlex at the very end, though he didn't say a name :-)

Test Data Collector

I've published a first version of the TestDataCollector today. The program can be downloaded here (binary only for the moment):
http://candescentnui.codeplex.com/releases/view/76254

It currently only works with OpenNI.

When I have collected a first batch of data I'll define an accuracy metric that compares the human-set points to those that the algorithm finds.
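
The metric could be as simple as the mean distance between the marked points and the detected points. A sketch (all names are assumptions, and it presumes the two point lists are already matched up):

float MeanDistance(IList<Point> marked, IList<Point> detected)
{
    float sum = 0;
    for (int index = 0; index < marked.Count; index++)
    {
        sum += Distance(marked[index], detected[index]); // hypothetical helper
    }
    return sum / marked.Count;
}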



How to collect test data:

1. Click "Start Depth Source", this will start the depth data stream and show it on the screen.
2. Click "Capture Frame" to capture a single frame out of the depth stream (or click "Capture Frame Delayed", then the frame is captured after 3 seconds).
3. Click one of the frames you captured in the frames bar
4. Add as many hands as are presented to the device
5. Mark the center of the palm by clicking the button under the corresponding hand box and then on the image at the right location.
6. Mark the fingers by clicking the button and then the fingers on the image. Try to mark the point on the edge as shown in the image (but make sure it's still on the hand, look at the z position). 
 7. Save the frame by clicking "Save Test Frame..."

If you would like to contribute to the test data collection, please send me your captured frames in a zip file to info@candescent.ch

I have to add the following:

Please note! By sending me data generated with the test data collector you agree to transfer your copyright of that data to me. This allows me to publish it in a test data collection under any license. This is necessary so I can add it to CodePlex and/or create a commercial version sometime in the future (no plans yet).

The data contains no other information than the depth frame and the annotated hand data (no color image). No data about you or your computer is collected.

If you don't agree, please do not use the test data collector.

Monday, October 31, 2011

Test Data Collector - Preview

Here's a preview of what I've been working on over the weekend. It's a tool that lets you take snapshots of the depth data stream. You can then add additional data that describes the hand (center of palm and finger tips).


This will allow the definition of an accuracy metric that measures how well the algorithm detects the fingers and center of palm.

Benefits
- Easier problem identification (problems in saved frames can be reproduced).
- Measuring improvements of the algorithm
- Automatic determination of settings

I've decided to create a separate application instead of including it in the samples project.


I'll need your help building a collection of such annotated frames!

Wednesday, October 26, 2011

Test Data

After more than one month of not working with the Kinect, I've plugged it back in today! :-) I've had some ideas and I'm eager to implement them.

Meanwhile I was experimenting with TDD (test first, acceptance tests...) on some other greenfield project of mine and it worked out nicely. I'll try to write more tests for Candescent NUI too.

Here's one of the planned new features:
You can save single depth frames (or maybe whole sequences) to disk. These can then be loaded again and used to test the algorithm. The idea: no more guessing whether a code or setting change improved the algorithm, just run the tests and measure the accuracy!


A lot of the code is already on CodePlex; the only two tests that are checked in use it. There are only two still depth frames checked in, and there is no UI for it yet. Now I want to make this more accessible in order to build a set of test data (all kinds of hands, your help will be needed!)

Example:
depth frame 1: Left hand with 3 fingers
depth frame 2: Left hand with 2 fingers, right hand with 5 fingers
depth frame 3: No hand
...

Ideally the location of the finger points and the center of palm will be manually marked for each test frame. That way, the points the algorithm finds can be compared with the expected, hand-marked points.

Saturday, October 22, 2011

HoloDesk and OmniTouch

Microsoft (Research) keeps working on new stuff that uses the Kinect. Here's a video on YouTube that features a HoloDesk.



The second video by Chris Harrison is an example of a wearable interface. Of course the depth camera and projector will have to shrink a bit. It would be connected to a smartphone you have in your pocket.

Wednesday, October 5, 2011

Vacation

I've been on vacation during the last two weeks, sorry to those who had to wait for an answer.

Tuesday, September 13, 2011

Talk at .NET User Group Berne Meeting

Yesterday I gave a short talk about Kinect development at the .NET User Group Meeting in Berne (I was the warm-up act for Golo Roden's talk about ADF).


A short summary plus pictures and the slides can be found here (in German).

Monday, September 12, 2011

OpenNI Arena

Yesterday Candescent Start Menu was added as a Featured App to OpenNI Arena!

Visit OpenNI Arena for more information and other OpenNI applications.

Wednesday, August 17, 2011

Candescent NUI (9237)

For those who were waiting for a binary version: it's now available for download here.

For details please read my last post.

Sunday, August 14, 2011

KinectMultiTouchDevice for WPF 4

Today I've published the KinectMultiTouchDevice class on CodePlex. It allows you to take the multi-touch support Microsoft added to WPF 4 and combine it with the Kinect! (Here's an introduction blog post that covers much of multi-touch in WPF 4.)

Multi-touch with Kinect and WPF 4

The update comes with two simple WPF samples in a new Candescent NUI WPF Samples project.
It also works with the WPF samples of Microsoft Surface 2.0 SDK, but you'll have to add the KinectTouchDevice for each demo project manually.

Screen capture of the multi-touch sample project

All fingers of both hands can be used simultaneously, though it rarely makes sense to use more than one or two per hand.

Please note: I've only published the source code; I'll add a binary release soon. The sample only runs with OpenNI for the moment.

Technical details
The class CCT.NUI.Touch.KinectMultiTouchDevice can be found in the CCT.NUI.HandTracking.dll.

In a WPF 4 window it can be initialized like this (in the Loaded event; you should not add this code in the constructor):

 this.factory = new OpenNIDataSourceFactory("config.xml");  
 var handDataSource = new HandDataSource(this.factory.CreateClusterDataSource());  
 this.multiTouchDevice = new KinectMultiTouchDevice(handDataSource, this);  
 handDataSource.Start();

The device identifiers are created the following way: the ID is always a two-digit decimal number where the first digit is the hand ID and the second digit is the finger ID. Finger IDs are assigned in order of appearance and do not identify a specific finger (the index finger, for example).
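
In code, that scheme amounts to this (a hypothetical helper for illustration, not the actual implementation):

// Hand 1, finger 2 => touch device ID 12
int CreateTouchDeviceId(int handId, int fingerId)
{
    return handId * 10 + fingerId;
}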

There are some glitches here and there; I'll keep working on it. For now I'm looking forward to seeing what you'll do with it!

Other changes in changeset 9161
  • New mapping algorithm
    Hands and fingers are mapped between two frames and keep the same ID until they disappear. IDs are reused! That means if Hand 1 is removed from view the next hand that appears will get ID 1 again. The same is true for the fingers.
  • Fingers and Hands implement the ILocatable interface
  • HandData.Id is now int instead of guid
  • Finger points get IDs. They are assigned in order of appearance, so you can't assume that 1 is always the thumb. Also the IDs are reused.
  • Option to show depth view in Candescent StartMenu
  • Exit button in Candescent Start Menu

Sunday, August 7, 2011

Kinect Multitouch Device for WPF 4 (Preview)

I'm working on a KinectMultiTouchDevice class which will create WPF 4 TouchDevices for touch input.

Here's a code snippet that shows how you will be able to initialize the device (inside the Loaded event of a WPF 4 window):

 this.factory = new OpenNIDataSourceFactory("config.xml");  
 var handDataSource = new HandDataSource(this.factory.CreateClusterDataSource());  
 this.multiTouchDevice = new KinectMultiTouchDevice(handDataSource, this);  
 handDataSource.Start();  

Then you can use the following events of the UIElement class:
  • public event EventHandler<TouchEventArgs> TouchEnter;
  • public event EventHandler<TouchEventArgs> TouchLeave;
  • public event EventHandler<TouchEventArgs> TouchMove;
  • public event EventHandler<TouchEventArgs> TouchUp;
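
For example, subscribing to TouchMove works like any other WPF touch handling (this uses the standard TouchEventArgs API; the canvas field is just an example):

this.canvas.TouchMove += (sender, e) =>
{
    var touchPoint = e.GetTouchPoint(this.canvas);
    Console.WriteLine("Touch {0} at {1}", e.TouchDevice.Id, touchPoint.Position);
};
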
[video]
This is a small sample that draws lines between two touch move events. It's based on the WpfTouchEventsSample that can be found here.


[video]
In this video multiple finger devices are used simultaneously (the lines are always drawn between the last two reported points).

I'm also working on a more sophisticated sample that I can share together with the KinectMultiTouchDevice class.

Wednesday, August 3, 2011

OpenNI Upgrade

With changeset 8918 I've updated OpenNI to the newest unstable version (1.3.2.3). All the changes are internal to Candescent NUI and don't influence its public interface.

I've also updated to the newest Avin2 SensorKinect driver. On that page you can find instructions on the installation order (OpenNI, then the driver, then NITE).

I've installed the 32-bit version on my 64-bit Windows PC without problems (this time).

Saturday, July 30, 2011

Gallery?

I thought about creating a gallery page that showcases uses of Candescent NUI. I'd like to see what you do with it!

If you think it's a good idea and would like to show what you're implementing, I'd welcome you to send an e-mail to: info@candescent.ch

What you should include:
• Short description
• Link to a video or screenshot(s)
• Whether you would like your entry with your name or anonymous
• Optional: Link to your own project page

Sunday, July 24, 2011

Thoughts about Kinect interfaces

I've watched lots of Kinect 'hack' videos and I've asked myself in what way they make interaction with computers [insert positive adjective here]. Is it easier, faster, more comfortable, more practical than traditional input with mouse and keyboard (or touch)? In most cases I find the answer is no. If you take away the coolness factor of controlling something without touching anything and look at it from a neutral perspective, the interfaces are slower, less accurate and often uncomfortable (stretching out your arms for hours? Having to hover over a button for x seconds to press it?).

Kinect is good for games in which you play a character that moves in a small area (like dance, boxing, fitness). It's good for places where you should not have to touch anything (clean rooms, wearing gloves) and maybe for public showcases if you don't want to use a touch screen. But it won't replace mice, keyboards and touchscreens.

There are a lot of videos featuring Kinect interfaces for games. These are in most cases not natural user interfaces. Yes, you don't need a device to control your character. But it's not natural to use your arms to look or move around. In reality you use your hands to manipulate things, turn your head or body to look around and use your legs to move.

I think this is going in the right direction:
SUVIX demo - VR environment using Kinect

Now imagine combining 3D glasses with skeleton tracking and a 2D treadmill (like this). Star Trek Holodeck, anyone? The only things still missing are force fields that let you feel what you interact with ;-)

Sunday, July 17, 2011

Candescent NUI Release Notes (8638)

There is a new release of Candescent NUI on CodePlex.

Changes
1. Replaced xn.Point3D with CCT.NUI.Core.Point
   This makes most of the code independent from OpenNI.
2. Introduced an optional maximal cluster depth setting
   The default is 150mm. The hand should not take up more than 15cm of depth, so the rest (the arm) is filtered out.
3. Fixed a bug that prevented the convex hull from drawing
4. Improved the finger detection algorithm (fewer false positives)
5. Increased speed
   • Clustering 7ms -> 6ms (15% improvement)
   • Hand and finger detection 16ms -> 12ms (25% improvement)

Get the source or binary version.

Tuesday, July 12, 2011

Candescent NUI Release Notes (8511)

Finger tip depth information
Finger tips now also have a z value indicating the distance from the Kinect in mm. I've already got some ideas how to use this!

Finger depth

Configuration
I've added three configuration "templates": default, fast and accurate. I'll tweak these settings, but for now I've just picked some values that I thought reasonable.

Settings

I've run some quick performance tests:

Fast
Clustering   6.8ms
Hands       11.3ms

Default
Clustering   7.2ms
Hands       16.2ms

Accurate
Clustering   8.1ms
Hands       32.3ms

... and I've fixed a dispose problem that led to performance issues in the sample app when the hand and finger detection button was clicked more than once.

Get the release here, the source code is here.

Skeleton tracking combined with finger detection - problems

Some of my readers have proposed combining skeleton tracking with finger detection, but I haven't tried this yet. Here is an example of why: I've found this video (already 2 months old) in which Patricio Gonzalez demonstrates a combination of the two techniques.

The problem is that for the skeleton tracking to work, you must be far enough away from the Kinect, and for the finger detection to work you have to be close enough. This defines a range in which both algorithms work, but in my opinion that range is empty at the current resolution of 640x480. I guess once depth cameras provide a resolution of 1024x768 or more, we will quickly see this catch on.

Thursday, July 7, 2011

Hand Detection Performance

I've worked on the performance of the hand detection, here is a summary:

When I started the optimizations the detection of a hand took around 22-23ms (which is a lot if you only have 33ms per frame). The distribution was like this:

Finding the contour
Used to take 12ms per frame (so this accounted for half of the time used). I've reduced this to 9ms and think there is still potential.

Center of the palm
Used to take 7ms per frame. Now it takes 3.5ms and is also more accurate.

Mapping the hands in the new frame to the hands in the old frame
3.5ms per frame, I didn't try to optimize this yet.

Finding the convex hull
< 1ms per frame, I didn't try to optimize this yet.

Detecting the fingers
< 1ms per frame, I didn't try to optimize this yet.

With the improvements I've got the total time down to 16ms (around 30% less).

I've introduced the TPL (Task Parallel Library) to take advantage of multi-core CPUs. Example from PalmFinder.cs:

Parallel.For(0, candidates.Count, (index) =>
{
    distances[index] = FindMaxDistance(contour, candidates[index]);
});

This will start multiple threads (the actual number depends on the number of cores that are available) to calculate the distances for the candidate points.

I've also made some variables that were used in loops method-scoped instead of class-scoped, which also increased speed.

This was measured on my Intel Core i7 860 (a quad-core CPU). Speed improvements on dual- or single-core CPUs might be smaller.

Edit: The new version is on CodePlex: 8454

Sunday, July 3, 2011

Image Manipulation on CodePlex

By popular request ;-) I've put the image manipulation code on CodePlex (change set 8391). It'd need some more refactoring, but I think as a sample it's good enough.

It requires some practice by the user, especially scaling with two fingers (as you can see in the video).

The movement isn't very smooth. That's because the center of the palm jumps around a bit. It is a trade-off vs. performance. I'll try to improve this.

Edit: I've tried it with the SDK, but the depth limitation makes it almost impossible to work with this sample. I hope Microsoft will soon add an option so that values closer than 850mm can be used. I've also found a little problem: the standard depth threshold was used in changeset 8391, so it would not work at all; I've fixed that in 8402.

Saturday, June 25, 2011

StartMenu on CodePlex

With change set 8122 I've added the source code for the Candescent StartMenu WPF application to CodePlex.

To show the menu, present your open hand to the Kinect at a distance of around 0.5-1 meter. To close the menu again, just close the hand. To start an application, stretch out one finger, move it over the icon (which gets highlighted), then close the hand quickly, so no finger is visible anymore.

If you're using the Kinect SDK the distance is limited to 850 to 950mm.
Open Start Menu

There is currently only one item in the start menu by default (Windows Explorer). You can configure programs or files to be shown in the Settings View. To open it, right click the task bar icon and select "Settings..." or double click the icon.

Start Menu Settings

The configuration is stored in the file "menu_config.csv"; you can also edit this file manually. Please don't change the first line. All other lines have to be in the format <Name>;<Path>.
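
A program line looks like this, for example (taken from the default configuration):

Explorer;c:\windows\explorer.exe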

To stop the program, press escape while the start menu is visible or right click the task bar icon and select "Close".

The icons in the settings view are from "Crystal Project Icons":
Author:  Everaldo Coelho
Site:    http://www.everaldo.com
Contact: everaldo@everaldo.com
License: LGPL

Thursday, June 23, 2011

Minimum Depth in the SDK

In this post in the SDK Beta Forum I asked Microsoft if there is an artificial minimum depth cap in their code. There is (at around 800-850mm) and currently there is no option to change it.

I think the depth range should be configurable. If you have the same opinion, please go to my post and support my request to make this minimum depth configurable - thanks a lot!

Saturday, June 18, 2011

Candescent NUI (7665)

I've put a new version on CodePlex (source and binary) and changed the official release to this version.

Please read the new quick start, because the creation of data sources has changed! http://candescentnui.codeplex.com/documentation

For those who use Candescent NUI and want to try the SDK: I'd recommend against installing it at the moment. For this project there are no advantages, only disadvantages. For example, the SDK returns no depth data closer than 85cm. I haven't compared performance yet.

There are advantages over OpenNI in general:
- Easier installation
- Audio data is supported
- You can control the motor
- Personally I like that the skeleton tracking does not need the Y calibration pose

Friday, June 17, 2011

Candescent NUI with Microsoft Kinect SDK

With change set 7506 Candescent NUI now supports both OpenNI and the Microsoft Kinect SDK.

You can choose to use OpenNI or the Kinect SDK. Selecting the one you don't have installed will result in an exception.

Warning 1! It seems the SDK returns depth data only between 850mm and 4000mm distance. The default range for the clustering is set to 500mm-800mm, and the algorithm works best in this range. You can try it at 850mm-1000mm, but it's not very reliable.
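
If you want to try it anyway, the clustering range has to be raised into the SDK's window. A sketch (the two property names on the settings object are assumptions):

var settings = new ClusterDataSourceSettings
{
    MinimumDepthThreshold = 850, // mm, the SDK's lower bound
    MaximumDepthThreshold = 1000 // mm
};
var clusterDataSource = new ClusterDataSource(depthDataSource, settings);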

I don't know yet whether there is an option to get the SDK to return values closer than 85cm (the device clearly can do it with OpenNI).

Warning 2! This was a fast check-in; I will have to refactor quite a lot (did some copy-paste to get it up faster, shame on me).

Thursday, June 16, 2011

Microsoft Kinect SDK is out!

Microsoft has finally released its Kinect SDK.

Guess it is going to be a busy day!

(Has anyone tried to install it along with OpenNI? Do they bite each other?)

Update: As expected, the SDK and the PrimeSense drivers don't get along, so I can't recommend installing the SDK if you still want to use OpenNI. I'll try to extend my project so that both OpenNI and the Microsoft SDK are supported, but this might take a while.

Wednesday, June 15, 2011

Multiple OpenNI Generators

Anonymous asked in this comment how to get both the depth and the RGB image synchronously.

I've made a quick example, which you can download here. I haven't had time to test it, so I'd be happy to receive feedback on whether it works for you. If it does, I will add those two classes in the next change set.

Tuesday, June 14, 2011

Configurable Start Menu

There is a new (binary) version of the start menu available at:

Changes
• There is now a task bar icon
  Start Menu Icon
  Right click to bring up the context menu
• I've added a user interface to configure the default menu

Menu Settings

This is actually the first WPF window I've built. I'm a WinForms veteran and I sometimes find it hard to get my head around how WPF works. But I'm beginning to like it, even though it still lacks some features that were present in WinForms.

Monday, June 13, 2011

Priorities

Because my time budget is limited, I'd like to know what you would like to see me working on with priority:
• 1 Publish the image manipulation part
• 1 Publish the start menu code
• 1 Detect finger direction (2D)
• 1 Fix performance issues
• 0 Try to improve the algorithm (making it more reliable, using skin detection...)
• ...?

Friday, June 10, 2011

Binary Version of the Start Menu

I've added a binary version of the Start Menu preview to CodePlex. I'm still working on the code (I've got a lot of ideas!), so the source is coming later.

http://candescentnui.codeplex.com/releases/view/68088

There is currently only one item in the start menu by default (Windows Explorer). There is no user interface for configuration yet, but you can add programs yourself by adding lines to the file menu_config.csv.

Please don't change the first line. Lines for programs have the following format: [Name];[Path]

Default
Explorer;c:\windows\explorer.exe
MyApp;c:\...\myapp.exe

To show the menu, present your open hand to the Kinect at a distance of around 0.5-1 meter. To close it again, just close the hand. To start an application, stretch out one finger, move it over the icon, then close the hand quickly, so no finger is visible anymore.

To stop the application, press escape while the menu is shown.

Side note: Maybe you've noticed the new PayPal donate button on the right side of the blog. I've decided to publish the quick start menu (and maybe more) applications for free as open source, but added the donate button in return. So if you would have bought the program, you are welcome to donate a small amount to help me cover expenses.

Friday, June 3, 2011

Gesture Controlled Start Menu (Preview)

This is a preview of a quick start menu that I'm implementing. It's based on the current hand / finger detection code that is available on CodePlex. The idea is that you can configure applications and/or files and quickly start them with a simple gesture. The menu is shown when the full hand is visible; then a menu entry can be selected with one finger. When the hand is closed while an icon is selected, the program / file gets started.

This application is written in C# 4.0 and uses WPF.

Upgrading OpenNI failed

Yesterday I tried to upgrade to the newest OpenNI release. Long story short: it failed and I had to revert to v1.0.0.25.

First I uninstalled all the old drivers plus OpenNI, NITE and PrimeSensor. Then I installed the newest driver from
https://github.com/avin2/SensorKinect

I'm running 64-bit Windows 7, so I was happy to see that there are now 64-bit versions of OpenNI, NITE and PrimeSensor. First I downloaded the stable versions (v1.1.0.38) and installed them according to this guide. Everything looked fine (in the device manager), but when I tried to start any sample I got this error message:

"OpenNI library can't find any module!"

I tried everything twice, including the unstable versions and the 32-bit versions, but I couldn't get it working, so now I'm back on OpenNI v1.0.0.25.

Has anyone got the newest release running on a 64-bit Windows 7 machine?

Monday, May 23, 2011

Change Set 5994

There is a new version of the source code available: 5994

- Added a build script (execute with build.bat, you might have to adjust the path)
- New property for HandDataSourceSettings: MinimalPointsInContour
- DetectCenterOfPalm = true by default
- Increased the speed of the palm finder a bit

(I did not publish a new binary release; I won't do that for every check-in.)

Saturday, May 14, 2011

Testing Hand and Finger Detection

Until now the project source code on CodePlex was missing an important part: tests!

In the newest commit (5603) I've added two integration tests that cover the whole production chain (but do not rely on OpenNI to provide the data) from depth data to clustering to hand detection.

Test 1: No hand is visible

Test 2: A hand with 5 fingers must be detected

Friday, May 13, 2011

Quick Start

I've added a small quick start documentation to the CodePlex page:

http://candescentnui.codeplex.com/documentation

Quickstart

First of all you need to create an OpenNIFacade. This object is used to interact with the OpenNI framework (www.openni.org).

var facade = new OpenNIFacade("config.xml");

A sample config XML is checked in; it must specify at least one Image and one Depth node.

How to create a cluster data source

The next step is to create a cluster data source. But first you need a DepthDataSource (actually an IDepthPointerDataSource, because it returns only a pointer to where the depth data is located in memory).

var depthDataSource = new DepthPointerDataSource(facade.GetDepthGenerator());

With this data source you can create the cluster data source:

var clusterDataSource = new ClusterDataSource(depthDataSource);

This code will use the default setting values. You can also pass in your own settings object:

var clusterDataSource = new ClusterDataSource(depthDataSource,
    new ClusterDataSourceSettings());

To get notified when new data is available, you can register for the NewDataAvailable event:

clusterDataSource.NewDataAvailable +=
    new NewDataHandler<ClusterData>(clusterDataSource_NewDataAvailable);

...

void clusterDataSource_NewDataAvailable(ClusterData data)
{
    for (int index = 0; index < data.Count; index++)
    {
        var cluster = data.Clusters[index];
        Console.WriteLine(string.Format("Cluster {0}: {1} / {2}",
            index, cluster.X, cluster.Y));
    }
}

How to create a hand data source

To create a hand data source you first need an IClusterDataSource (see the previous chapter).

var clusterDataSource = ...

var handDataSource = new HandDataSource(clusterDataSource);
handDataSource.NewDataAvailable +=
    new NewDataHandler<HandCollection>(handDataSource_NewDataAvailable);

...

void handDataSource_NewDataAvailable(HandCollection data)
{
    for (int index = 0; index < data.Count; index++)
    {
        var hand = data.Hands[index];
        Console.WriteLine(string.Format("Fingers on hand {0}: {1}",
            index, hand.FingerPointCount));
    }
}

Sunday, May 8, 2011

Finger Detection on CodePlex

Over the weekend I decided to release an initial version of the finger detection algorithm. You are welcome to download the newest version from CodePlex (http://candescentnui.codeplex.com).

[video]

I'd be happy to receive feedback and to see what you use it for!

Please note: it is not as well refactored as the clustering part and I'm still working on it.

Some restrictions:
• Finger points only have 2D coordinates (z is always 0)
• Finger points are not ordered. It can happen that they swap position in the array between frames
• There can be false positives for the finger points
• Detecting the center of the palm is currently too slow, so it is disabled by default

Guidance on how to position your hand
It works best if you position your hand flat towards the viewing angle of the Kinect. It's also helpful if not too much of the arm is within the processed depth segment; this decreases the chance of false positives.

Correct hand position

The hand is too close
The hand is too far or the cluster is not compact
False positive in the lower left corner

Properties
There is a new settings window that allows you to change settings while the program is running.

Settings window