Community Core Vision (or CCV) is a computer vision application developed for multi-touch tables (like the Microsoft Surface table). It uses OpenFrameworks as the creative-coding framework, OpenGL for rendering the GUI and, of course, OpenCV for the vision part.
The application is intended for use with touch tables. How? Well, touch tables have lasers that produce infrared light across the surface of the table. When you put your finger on the table, you block this light; in fact, you reflect it towards the bottom side of the table, where a high-speed infrared camera captures it. This reflected infrared image is what the application takes as input.
The application gives you a few filters that you can tweak to track fingers accurately. You can subtract the background, blur the image, apply a high-pass filter and amplify the result. The results look something like this:
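To make the pipeline concrete, here's a simplified sketch of that filter chain in NumPy. The function name, parameters and exact kernel are my own choices, not CCV's actual implementation; CCV does this internally with OpenCV on live camera frames.

```python
import numpy as np

def track_filters(frame, background, blur_size=5, amplify=2.0):
    """Sketch of a CCV-style filter chain (assumed pipeline):
    background subtraction -> blur -> high-pass -> amplify."""
    # 1. Background subtraction: remove the static parts of the scene.
    diff = frame.astype(np.float64) - background.astype(np.float64)
    diff = np.clip(diff, 0, 255)

    # 2. Blur: separable box blur (average rows, then columns).
    kernel = np.ones(blur_size) / blur_size
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, diff)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)

    # 3. High-pass: subtracting the blur keeps only small, sharp
    #    features -- finger-sized bright blobs.
    highpass = np.clip(diff - blurred, 0, 255)

    # 4. Amplify: scale up the faint remaining signal.
    return np.clip(highpass * amplify, 0, 255).astype(np.uint8)
```

Feeding it a frame with one bright spot over an empty background leaves the spot much brighter than its surroundings, which is exactly what makes blob tracking easy afterwards.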
Once you have an accurate track, you can transmit the position of each finger over TUIO, Flash XML or binary TCP. A client application can connect and receive information about each finger: ID, position, displacement and acceleration.
Your application can then make intelligent decisions based on this data.
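On the client side, you would typically keep a small state record per finger ID and derive displacement and acceleration from successive position updates. A minimal Python sketch of that bookkeeping (the class and field names are hypothetical, not CCV's actual wire format):

```python
from dataclasses import dataclass

@dataclass
class Finger:
    """Per-finger state as described above: ID, position,
    displacement and acceleration (frame-to-frame)."""
    finger_id: int
    x: float = 0.0
    y: float = 0.0
    dx: float = 0.0   # displacement since the last update
    dy: float = 0.0
    ax: float = 0.0   # change in displacement = acceleration
    ay: float = 0.0

    def update(self, x: float, y: float) -> None:
        # New displacement, and acceleration as its change since
        # the previous frame.
        new_dx, new_dy = x - self.x, y - self.y
        self.ax, self.ay = new_dx - self.dx, new_dy - self.dy
        self.dx, self.dy = new_dx, new_dy
        self.x, self.y = x, y
```

Your application would call `update()` once per incoming TUIO packet and read the derived fields to decide, say, whether a finger is flicking or resting.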
This isn't limited to touch tables; you can use it to track other objects too. The background subtraction makes the application very versatile, so you can detect pretty much anything. Here's a sample track I did on myself:
See how useful the application can be? Here's a list of problems that this application just might solve:
Try out Community Core Vision; it might help you solve the problem you're working on. And leave a comment or let me know if you come up with an interesting use of this app!