
OpenNI Cookbook

Soroush Falahati

Learn how to write NIUI-based applications and motion-controlled games
eBook: $26.99 (RRP $26.99)
Print + eBook: $44.99 (RRP $44.99)

Book Details

ISBN 13: 9781849518468
Paperback: 324 pages

About This Book

  • Use OpenNI for all your needs from games and application UI to low-level data processing or motion detection
  • Learn more about the Natural Interaction features of OpenNI
  • Useful for both beginners and professionals, covering everything from the most basic to the most advanced concepts of OpenNI
  • Full of illustrations, examples, and tips for understanding different aspects of topics, with clear step-by-step instructions to get different parts of OpenNI working for you

Who This Book Is For

If you are a beginner or a professional in NIUI and want to write serious applications or games, then this book is for you. Even OpenNI 1.x programmers who want to move to the new version of OpenNI can use this book as a starting point.

This book uses C++ as its primary language, but there are some examples in C# and Java too, so you need a basic working knowledge of C or C++ in most cases.

Table of Contents

Chapter 1: Getting Started
Introduction
Downloading and installing OpenNI
Downloading and installing NiTE
Downloading and installing the Microsoft Kinect SDK
Connecting Asus Xtion and PrimeSense sensors
Connecting Microsoft Kinect
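
A quick way to confirm that the pieces from this chapter are in place is to initialize OpenNI and open whatever device is attached. This is a minimal sketch, assuming OpenNI 2 is installed and its headers and libraries are on the build path:

    #include <OpenNI.h>
    #include <cstdio>

    int main()
    {
        // Load the OpenNI runtime and any installed drivers (Kinect, Xtion, and so on).
        if (openni::OpenNI::initialize() != openni::STATUS_OK)
        {
            std::printf("Initialization failed:\n%s\n", openni::OpenNI::getExtendedError());
            return 1;
        }

        // Open whichever compatible sensor is plugged in.
        openni::Device device;
        if (device.open(openni::ANY_DEVICE) != openni::STATUS_OK)
        {
            std::printf("No device found:\n%s\n", openni::OpenNI::getExtendedError());
            openni::OpenNI::shutdown();
            return 1;
        }

        std::printf("Connected device: %s\n", device.getDeviceInfo().getName());
        device.close();
        openni::OpenNI::shutdown();
        return 0;
    }
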
Chapter 2: OpenNI and C++
Introduction
Creating a project in Visual Studio 2010
OpenNI class and error handling
Enumerating a list of connected devices
Accessing video streams (depth/IR/RGB) and configuring them
Retrieving a list of supported video modes for depth stream
Selecting a specific device for accessing depth stream
Listening to the device connect and disconnect events
Opening an already recorded file (ONI file) instead of a device
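
As a taste of this chapter's recipes, here is a minimal C++ sketch, assuming OpenNI 2, that enumerates the connected devices and lists the video modes supported by the depth sensor of the first one:

    #include <OpenNI.h>
    #include <cstdio>

    int main()
    {
        openni::OpenNI::initialize();

        // Enumerate every OpenNI-compatible device currently connected.
        openni::Array<openni::DeviceInfo> devices;
        openni::OpenNI::enumerateDevices(&devices);
        for (int i = 0; i < devices.getSize(); ++i)
            std::printf("%d: %s (%s)\n", i, devices[i].getName(), devices[i].getUri());

        // Open the first device (a specific URI from the list above could be
        // passed instead of ANY_DEVICE) and ask its depth sensor for its modes.
        openni::Device device;
        if (device.open(openni::ANY_DEVICE) == openni::STATUS_OK)
        {
            const openni::SensorInfo* info = device.getSensorInfo(openni::SENSOR_DEPTH);
            if (info != NULL)
            {
                const openni::Array<openni::VideoMode>& modes = info->getSupportedVideoModes();
                for (int i = 0; i < modes.getSize(); ++i)
                    std::printf("%dx%d @ %d fps\n",
                                modes[i].getResolutionX(),
                                modes[i].getResolutionY(),
                                modes[i].getFps());
            }
            device.close();
        }

        openni::OpenNI::shutdown();
        return 0;
    }
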
Chapter 3: Using Low-level Data
Introduction
Configuring Visual Studio 2010 to use OpenGL
Initializing and preparing OpenGL
Reading and showing a frame from the image sensor (color/IR)
Reading and showing a frame from the depth sensor
Controlling the player when opening a device from file
Recording streams to file (ONI file)
Event-based reading of data
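
The core of this chapter is reading frames from a stream. The sketch below, assuming OpenNI 2 and any compatible depth sensor, reads depth frames in a blocking loop and prints the depth of the center pixel; rendering with OpenGL, as the chapter does, is left out to keep it short:

    #include <OpenNI.h>
    #include <cstdio>

    int main()
    {
        openni::OpenNI::initialize();

        openni::Device device;
        device.open(openni::ANY_DEVICE);

        // Create and start the depth stream.
        openni::VideoStream depth;
        depth.create(device, openni::SENSOR_DEPTH);
        depth.start();

        openni::VideoFrameRef frame;
        for (int i = 0; i < 100; ++i)
        {
            // Blocking read of the next depth frame.
            if (depth.readFrame(&frame) != openni::STATUS_OK)
                continue;

            // Depth pixels are 16-bit values; with the default video mode the
            // unit is millimetres.
            const openni::DepthPixel* pixels =
                static_cast<const openni::DepthPixel*>(frame.getData());
            int center = frame.getWidth() / 2 + frame.getHeight() / 2 * frame.getWidth();
            std::printf("Frame %d: center depth = %d mm\n",
                        frame.getFrameIndex(), pixels[center]);
        }

        depth.stop();
        depth.destroy();
        device.close();
        openni::OpenNI::shutdown();
        return 0;
    }
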
Chapter 4: More about Low-level Outputs
Introduction
Cropping and mirroring frames right from the buffer
Syncing image and depth sensors to read new frames from both streams at the same time
Overlaying the depth frame over the image frame
Converting the depth unit to millimetres
Retrieving the color of the nearest point without depth over color registration
Enabling/disabling auto exposure and auto white balance
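
These recipes are mostly about configuring the device and its streams before reading. A rough configuration sketch, assuming OpenNI 2 and a sensor that supports each feature (real code should check every return value):

    #include <OpenNI.h>

    int main()
    {
        openni::OpenNI::initialize();

        openni::Device device;
        device.open(openni::ANY_DEVICE);

        openni::VideoStream depth, color;
        depth.create(device, openni::SENSOR_DEPTH);
        color.create(device, openni::SENSOR_COLOR);

        // Mirror and crop frames in the stream itself, before they reach us.
        depth.setMirroringEnabled(true);
        depth.setCropping(0, 0, 320, 240);

        // Deliver depth and color frames as synchronized pairs.
        device.setDepthColorSyncEnabled(true);

        // Register (overlay) the depth map onto the color image, if supported.
        if (device.isImageRegistrationModeSupported(openni::IMAGE_REGISTRATION_DEPTH_TO_COLOR))
            device.setImageRegistrationMode(openni::IMAGE_REGISTRATION_DEPTH_TO_COLOR);

        // Camera settings (auto exposure, auto white balance) live on the color stream.
        openni::CameraSettings* camera = color.getCameraSettings();
        if (camera != NULL)
        {
            camera->setAutoExposureEnabled(false);
            camera->setAutoWhiteBalanceEnabled(false);
        }

        depth.start();
        color.start();
        // ... read frames as usual ...

        openni::OpenNI::shutdown();
        return 0;
    }
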
Chapter 5: NiTE and User Tracking
Introduction
Getting a list of all the active users
Identifying and coloring users' pixels in depth map
Reading users' bounding boxes and center of mass
Event-based reading of users' data
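
A minimal user-tracking sketch in the spirit of this chapter, assuming NiTE 2 is installed next to OpenNI 2; it lists the visible users with their centers of mass:

    #include <NiTE.h>
    #include <cstdio>

    int main()
    {
        nite::NiTE::initialize();

        // Create a user tracker on the default device.
        nite::UserTracker userTracker;
        if (userTracker.create() != nite::STATUS_OK)
            return 1;

        nite::UserTrackerFrameRef frame;
        for (int i = 0; i < 300; ++i)
        {
            if (userTracker.readFrame(&frame) != nite::STATUS_OK)
                continue;

            // Every visible user, with its center of mass in world coordinates (mm).
            const nite::Array<nite::UserData>& users = frame.getUsers();
            for (int u = 0; u < users.getSize(); ++u)
            {
                const nite::UserData& user = users[u];
                if (user.isLost())
                    continue;
                nite::Point3f com = user.getCenterOfMass();
                std::printf("User %d center of mass: %.0f %.0f %.0f\n",
                            user.getId(), com.x, com.y, com.z);
            }
        }

        nite::NiTE::shutdown();
        return 0;
    }
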
Chapter 6: NiTE and Hand Tracking
Introduction
Recognizing predefined hand gestures
Tracking hands
Finding the related user ID for each hand ID
Event-based reading of hands' data
Working sample for controlling the mouse by hand
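
A hand-tracking sketch along the lines of this chapter's recipes, assuming NiTE 2: it waits for the predefined wave gesture and then tracks the hand that performed it (the mouse-control part of the last recipe is omitted):

    #include <NiTE.h>
    #include <cstdio>

    int main()
    {
        nite::NiTE::initialize();

        nite::HandTracker handTracker;
        if (handTracker.create() != nite::STATUS_OK)
            return 1;

        // Ask NiTE to look for the predefined "wave" gesture.
        handTracker.startGestureDetection(nite::GESTURE_WAVE);

        nite::HandTrackerFrameRef frame;
        for (int i = 0; i < 1000; ++i)
        {
            if (handTracker.readFrame(&frame) != nite::STATUS_OK)
                continue;

            // Start tracking a new hand wherever a wave was completed.
            const nite::Array<nite::GestureData>& gestures = frame.getGestures();
            for (int g = 0; g < gestures.getSize(); ++g)
            {
                if (gestures[g].isComplete())
                {
                    nite::HandId handId;
                    handTracker.startHandTracking(gestures[g].getCurrentPosition(), &handId);
                }
            }

            // Report the position of every hand currently being tracked.
            const nite::Array<nite::HandData>& hands = frame.getHands();
            for (int h = 0; h < hands.getSize(); ++h)
            {
                if (hands[h].isTracking())
                {
                    const nite::Point3f& p = hands[h].getPosition();
                    std::printf("Hand %d at %.0f %.0f %.0f\n", hands[h].getId(), p.x, p.y, p.z);
                }
            }
        }

        nite::NiTE::shutdown();
        return 0;
    }
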
Chapter 7: NiTE and Skeleton Tracking
Introduction
Detecting a user's pose
Getting a user's skeleton joints and displaying their position in the depth map
Designing a simple pong game using skeleton tracking
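
A skeleton-tracking sketch matching this chapter's first recipes, assuming NiTE 2: it requests skeleton tracking for every new user and prints the head joint once it is tracked with reasonable confidence:

    #include <NiTE.h>
    #include <cstdio>

    int main()
    {
        nite::NiTE::initialize();

        nite::UserTracker userTracker;
        if (userTracker.create() != nite::STATUS_OK)
            return 1;

        nite::UserTrackerFrameRef frame;
        for (int i = 0; i < 1000; ++i)
        {
            if (userTracker.readFrame(&frame) != nite::STATUS_OK)
                continue;

            const nite::Array<nite::UserData>& users = frame.getUsers();
            for (int u = 0; u < users.getSize(); ++u)
            {
                const nite::UserData& user = users[u];

                // Calibration starts automatically once tracking is requested.
                if (user.isNew())
                    userTracker.startSkeletonTracking(user.getId());

                if (user.getSkeleton().getState() == nite::SKELETON_TRACKED)
                {
                    const nite::SkeletonJoint& head =
                        user.getSkeleton().getJoint(nite::JOINT_HEAD);
                    if (head.getPositionConfidence() > 0.5f)
                        std::printf("User %d head at %.0f %.0f %.0f\n",
                                    user.getId(),
                                    head.getPosition().x,
                                    head.getPosition().y,
                                    head.getPosition().z);
                }
            }
        }

        nite::NiTE::shutdown();
        return 0;
    }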

What You Will Learn

  • Retrieve and use depth, vision, and audio from compatible devices
  • Get basic information about the environment
  • Recognize hands and humans, detect their skeletons, and track their movements
  • Customize frames right from the device itself
  • Identify basic gestures like pushing or swiping
  • Select between devices or use more than one device to read data
  • Recognize pre-defined hand gestures and detect user poses

In Detail

The release of the Microsoft Kinect, followed by the PrimeSense Sensor and the Asus Xtion, opened new doors for developers: new ways to interact with users, redesigned application UIs, and applications that are aware of their environment (context). For this purpose, developers need a good framework that provides a complete application programming interface (API), and OpenNI is the first choice in this field. This book introduces the new version of OpenNI.

"OpenNI Cookbook" will show you how to start developing a Natural Interaction UI for your applications or games with high level APIs and at the same time access RAW data from different sensors of different hardware supported by OpenNI using low level APIs. It also deals with expanding OpenNI by writing new modules and expanding applications using different OpenNI compatible middleware, including NITE.

"OpenNI Cookbook" favors practical examples over plain theory, giving you a more hands-on experience to help you learn. OpenNI Cookbook starts with information about installing devices and retrieving RAW data from them, and then shows how to use this data in applications. You will learn how to access a device or how to read data from it and show them using OpenGL, or use middleware (especially NITE) to track and recognize users, hands, and guess the skeleton of a person in front of a device, all through examples.You also learn about more advanced aspects such as how to write a simple module or middleware for OpenNI itself.

"OpenNI Cookbook" shows you how to start and experiment with both NIUI designs and OpenNI itself using examples.

Authors

Soroush Falahati